Two things:

Using the Perl API means you have to be able to log in to each domain, and
you can't retrieve a full list of domains from the API (yet) - so that
wouldn't work.  You have to use the web interface.

Also, even if it took no more than a day, we just can't spare a programmer
for it yet.  There are other implications, too: do we prevent hammering?
Do we offer customized formats (i.e., selectable fields to report)?  What
information do we provide - simple contact information, or a transaction
audit?  (An audit is more expensive for the system than plain info.)  Many
questions.  We have to design it, code it, test it, train internally, then
release it - and fix bugs along the way, hopefully not too many by the time
it gets to you folks.  This'll take more than a day :)

Charles Daminato
Product Manager (ccTLDs)
Tucows Inc. - [EMAIL PROTECTED]

> -----Original Message-----
> From: Csongor Fagyal [mailto:[EMAIL PROTECTED]]
> Sent: March 7, 2001 2:48 PM
> To: Charles Daminato; Tom Brown; [EMAIL PROTECTED]
> Subject: Re: How to retrieve a list of all our customers domains
>
>
> Tom: How about using the Perl API? Isn't that easier to use than
> emulating HTTPS calls?
>
> Charles: It seems to me that setting up one CGI script that gives back
> only the most important data in some delimited text format should not
> take more than a day for someone on the server side who is familiar
> with the system...
>
> - Csongor
>
> ----- Original Message -----
> From: "Charles Daminato" <[EMAIL PROTECTED]>
> To: "Tom Brown" <[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>
> Sent: Wednesday, March 07, 2001 8:20 PM
> Subject: RE: How to retrieve a list of all our customers domains
>
>
> > As long as you don't try to get all 10,000 of your domains (and info) at
> > once, we probably won't notice.  The logic being, you haven't kept track
> > until now - if it takes a week to parse, let it do so...
> >
> > We're painfully aware of the lack of "data dumps" our system currently
> > has for our resellers.  We WILL get to it, but it's a back-burner issue.
> > If the data is THAT important, it should be tracked locally - it's when
> > you decide to start tracking (after some time) that things get tricky.
> > It'll happen... until then, Tom's tricks will help (thanks Tom!) - as
> > long as you remember to be patient ;)
> >
> > Charles Daminato
> > Product Manager (ccTLDs)
> > Tucows Inc. - [EMAIL PROTECTED]
> >
> > > -----Original Message-----
> > > From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On
> > > Behalf Of Tom Brown
> > > Sent: March 7, 2001 1:55 PM
> > > To: [EMAIL PROTECTED]
> > > Subject: RE: How to retrieve a list of all our customers domains
> > >
> > >
> > >
> > > Did you guys break the sslbot again? (Doesn't look like it. :)  I had
> > > to write another one for a different supplier and came up with a much
> > > more robust page-parsing algorithm :-) [split out the tables, then
> > > split out the rows... then parse the rows].  Almost looking forward
> > > to rewriting that sslbot script :-)
> > >
> > > Considering that forcing clients to write robot scripts to pound your
> > > servers into the dirt is just generally a bad idea, and a simple text
> > > output screen to dump everything would be _trivial_ to write... you
> > > guys don't make this easy.  (I know we've discussed this before, and
> > > I don't hold you responsible, Chuck... :)
> > >
> > > That said, I have no real desire to grab all the customer fields, and
> > > I would probably do that via a whois lookup instead anyway... it's
> > > much easier to parse text output than HTML, and it should be faster
> > > than using an SSL connection.
> > >
> > > Taking it from that point of view, the sslbot link shown below is a
> > > fine starting point; parsing whois output is trivial compared to
> > > handling the RWI authentication and parsing HTML.  E.g., use the
> > > sslbot to get the list of domains, then write a short script to hit
> > > the whois server for each domain listed by the sslbot.
> > >
> > > The only gotcha there is if OpenSRS gets upset because you pounded on
> > > their whois server too many times.
> > >
> > > -Tom
> > >
> > > (P.S. Anyone who wants to respond to me, please send To: me, and cc:
> > > the list if you want... I don't generally read the list except for
> > > OpenSRS staff postings, but my filters will pick up directly addressed
> > > mail...)
> > >
> > > On Wed, 7 Mar 2001, Charles Daminato wrote:
> > >
> > > > You can use a variation of a script found here:
> > > >
> > > > http://www.opensrs.org/archives/dev-list/0102/0156.html
> > > >
> > > > or an older version (not using curl) here:
> > > >
> > > > http://www.opensrs.org/archives/discuss-list/0012/0147.html
> > > >
> > > > The second script doesn't work as-is - it'll take some tweaking,
> > > > but the core pieces are there.
> > > >
> > > > Charles Daminato
> > > > Product Manager (ccTLDs)
> > > > Tucows Inc. - [EMAIL PROTECTED]
> > > >
> > > > > -----Original Message-----
> > > > > From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On
> > > > > Behalf Of Guy Baconnière
> > > > > Sent: March 7, 2001 12:33 PM
> > > > > To: [EMAIL PROTECTED]
> > > > > Subject: How to retrieve a list of all our customers domains
> > > > >
> > > > > Hi,
> > > > >
> > > > > We want to import into our SQL database all contact fields and
> > > > > billing information for all of our customers' registered domains.
> > > > >
> > > > > We cannot access the list of all domains through the API without
> > > > > having the login/password of each customer.  We need a "reseller"
> > > > > API that gives us access to all the information available manually
> > > > > on the reseller web interface (http://resellers.opensrs.net/),
> > > > > like contact fields, billing status, etc.
> > > > >
> > > > > Is there any way to do that directly from a Perl script without
> > > > > having our customers' usernames and passwords?
> > >
> > >
> >
> >
>
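
Tom's suggestion quoted above - use the sslbot to collect the domain list,
then hit the whois server once per domain, gently - might be sketched like
this. This is a hedged outline, not a tested tool: the whois server name is
a placeholder, real whois output varies in format, and it's Python rather
than the Perl the scripts in this thread use:

```python
import socket
import time

def whois_query(domain, server="whois.example.net", port=43, timeout=10):
    """One whois lookup: send the query line, then read until EOF (RFC 3912).

    The server name is a placeholder - substitute the registry's real
    whois host.
    """
    with socket.create_connection((server, port), timeout=timeout) as sock:
        sock.sendall(domain.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("latin-1")

def parse_whois(text):
    """Turn simple "Key: value" whois output into a dict (first value wins).

    Lines starting with "%" are treated as server comments and skipped.
    """
    fields = {}
    for line in text.splitlines():
        if ":" in line and not line.lstrip().startswith("%"):
            key, _, value = line.partition(":")
            fields.setdefault(key.strip(), value.strip())
    return fields

def crawl(domains, delay=2.0):
    """Query each domain, sleeping between requests to avoid hammering."""
    results = {}
    for domain in domains:
        results[domain] = parse_whois(whois_query(domain))
        time.sleep(delay)
    return results
```

The delay between queries is the point of Tom's "gotcha": pacing the crawl
keeps the whois server happy even if the full run takes a while.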
