[consult] Additional background information

michael.dillon at bt.com michael.dillon at bt.com
Sun Mar 18 10:15:57 EDT 2007

> There are some very large organizations, some of whom happen to be
> our customers, that have >256 resources.  To be more specific, >256
> netblocks, with no way to further narrow down the search.

I think this is the key thing. Whois was not designed as a search tool,
and therefore it did not build on the prior work done in the search
field. Nowadays there are numerous examples of search tools which DO
provide ways to narrow down a search. Even Google allows this: you can
add search terms, and you can subtract terms with -keyword.
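To make the idea concrete, here is a rough Python sketch of that kind of
add/subtract narrowing applied to a whois result set. The function name and
sample records are made up for illustration; this is not part of any actual
whois implementation.

```python
def narrow(records, query):
    """Filter text records Google-style: plain terms must appear,
    '-term' entries must not appear. Hypothetical helper for illustration."""
    terms = query.split()
    include = [t.lower() for t in terms if not t.startswith("-")]
    exclude = [t[1:].lower() for t in terms if t.startswith("-")]
    kept = []
    for rec in records:
        text = rec.lower()
        if all(t in text for t in include) and not any(t in text for t in exclude):
            kept.append(rec)
    return kept

# Made-up example records standing in for whois output:
records = [
    "NET-192-0-2-0-24    Example Corp  Chicago",
    "NET-198-51-100-0-24 Example Corp  Dallas",
    "NET-203-0-113-0-24  Other Org     Chicago",
]
print(narrow(records, "example -dallas"))
```

Subtracting terms this way is often enough to get a >256-record result set
down to something readable.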

OpenFTS, htdig, SWISH++, Nutch, Xapian and so on, are open source search
tools which could be implemented to supplement the whois lookups when
result-sets exceed some limit like 256. Imagine that the result set is
calculated but not sent to the user. Instead, it is fed locally into a
search engine and the user is given a ticket to narrow down their search
for some time period.
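A minimal sketch of that ticket scheme, again in Python with invented names
(start_search, refine, the TTL value) purely to show the shape of it:

```python
import time
import uuid

TICKET_TTL = 3600  # hypothetical: one hour to narrow the search
_sessions = {}     # ticket -> (expiry time, cached result set)

def start_search(full_results, limit=256):
    """Return small result sets directly; cache large ones server-side
    and hand the user a ticket for iterative narrowing."""
    if len(full_results) <= limit:
        return ("results", full_results)
    ticket = uuid.uuid4().hex
    _sessions[ticket] = (time.time() + TICKET_TTL, full_results)
    return ("ticket", ticket)

def refine(ticket, term):
    """Narrow a cached result set: '-term' excludes, a plain term requires."""
    expiry, results = _sessions[ticket]
    if time.time() > expiry:
        del _sessions[ticket]
        raise KeyError("ticket expired")
    if term.startswith("-"):
        results = [r for r in results if term[1:].lower() not in r.lower()]
    else:
        results = [r for r in results if term.lower() in r.lower()]
    _sessions[ticket] = (expiry, results)
    return results

# Usage with 300 made-up records, which exceeds the 256 limit:
records = ["NET-%d  Example Corp" % i for i in range(300)]
kind, token = start_search(records)
subset = refine(token, "NET-29")
```

The point is that the expensive full query runs once; everything after that
is cheap filtering against the cached set until the ticket expires.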

>  However I haven't had any problems requesting the information from
> ARIN recently.  At the time I submitted the request to the
> consultation and suggestion process, to alter or remove the response
> limit, I was unaware that the full query must still be performed in
> order to be able to page through results.  Having learned that, I
> became rather resigned to having to request the info from ARIN each
> time I need it.

In essence, the current whois search tool refers the user to the
hostmaster who finds some human being to calculate the entire result set
locally. Then they either send the entire result set to the end user, or
use some local tools to narrow down the search. The question here is
where the narrowing down should be done and whether or not human beings
have to be in the loop for searches with large result sets. My view is
that more modern and sophisticated tools will make this problem go away,
even if we maintain the limit of 256 results as a general rule.

Of course, there is another way that allows us to keep the creaky old
whois engine and put the extra load on the end users. That is to allow
people to sign up for a special service where they can run whois queries
with no limits. That way an end user can log in, run a long query, save
the results locally, and use local tools to narrow down the search.

> That said, I would like to see the limit raised or a way found to
> remove it.  It's a little thing, it probably doesn't affect a
> significant number of people, but it would make life a little easier,
> the day run a little smoother.

It is also generally a good thing for ARIN's public image if it is seen
to be cleaning up the dusty corners of its domain. The processes and
tools inherited from the Internic were antiquated and poorly designed,
even by the standards of the 1990s. ARIN has done a lot of work in
fixing this, much of it invisible to outsiders. But I believe that ARIN
should not rest on its laurels; it should continue fixing and replacing
these legacy processes and tools.

--Michael Dillon

P.S. Even if there is no per-query limit for logged-in users, there
should still be a daily limit, and some way of recovering any records
that were truncated due to that limit. I.e., I log in on Monday and run
a query which would return 26,000 records, but the daily limit is
10,000. On Tuesday I log in and rerun the query requesting results
>10,000, and on Wednesday I log in again and request results >20,000.
This may be slightly annoying, but it is justified to prevent data
mining and is likely to be infrequently encountered by real users.
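The day-by-day recovery scheme above is just offset paging with a daily cap.
A sketch, with an invented daily_page helper and cap value, assuming the
query is deterministic so reruns return records in the same order:

```python
DAILY_LIMIT = 10_000  # hypothetical per-day record cap

def daily_page(full_results, day):
    """Return the slice of a large, stably ordered result set available
    on a given day (day 0 = Monday in the example above), plus the
    offset to request on the next day."""
    start = day * DAILY_LIMIT
    chunk = full_results[start:start + DAILY_LIMIT]
    return chunk, start + len(chunk)

# Made-up 26,000-record result set, split across three days:
records = list(range(26_000))
monday, next_offset = daily_page(records, 0)
```

With 26,000 records the user gets 10,000 on Monday, 10,000 on Tuesday, and
the remaining 6,000 on Wednesday, exactly as described.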
