On Fri, Aug 14, 2009 at 2:27 AM, Dr. David
Kirkby<david.kir...@onetel.net> wrote:
>
> William Stein wrote:
>> On Thu, Aug 13, 2009 at 1:55 PM, Peter
>> Jeremy<peterjer...@optushome.com.au> wrote:
>>> On 2009-Aug-11 15:29:29 -0700, William Stein <wst...@gmail.com> wrote:
>>>> I just wanted to let people know that David Ackerman -- a UW student who
>>>> took my course on Sage last quarter -- is working (funded by NSF) on
>>>> creating a "units package" for Sage right _now_.
>>> Since no-one else has mentioned it, I presume you are aware that there
>>> is a 'units' tool included with Unix that does most of this.  It looks
>>> like it's optional on Linux (at least boxen knows it exists but doesn't
>>> have it installed).  It's standard on FreeBSD (and the underlying
>>> constants were worked through fairly recently to bring them into line
>>> with the latest values).
>>>
>>
>> I "apt-get install"ed it, so it is now on sage.math.
>>
>> Regarding interval arithmetic, the Sage symbolic manipulation package
>> works with intervals, so it would be straightforward at any point in
>> the future to have unit conversion ratios be intervals if one wanted
>> that.
>>
>> I would find USD and EUR conversion useful when I'm traveling.   Then
>> the conversion should be looked up somehow, using urllib say, when the
>> program starts or every few minutes.
>>
>> William
>
>
> Mathematica can look up conversions. It also has an historical database
> of conversion rates.
>
> But I doubt anyone would seriously use MMA to look up today's exchange
> rate. Is somewhere like: http://www.xe.com/ not easier?  I suppose there
> is some argument to being able to download the latest data each time you
> connect to the internet, so when on a plane with no WiFi, you will have
> reasonably recent data. But certainly if I had internet access, I'd use
> http://www.xe.com/ not any maths software.
>
> I've tried to parse the odd web site in the past. The problem is it
> tends to break over time, as webmasters make small changes. One site I
> used to connect to had chess games in a zip file. Then they started
> putting the games in a directory, not at the top level. It was
> relatively easy to fix, but I've had similar experiences parsing other
> web sites.

Google Finance and Yahoo both have HTTP APIs for getting all kinds of
financial data.  They are very stable.  Sage has an interface to them
for stock data, which was written about 1.5 years ago and hasn't
needed updating since.   Also, the APIs can be tested regularly before
Sage releases.
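As a rough sketch of what talking to such an API looks like (the
endpoint URL, field codes, and sample response below are made-up
placeholders for illustration, not the documented Google or Yahoo
interface), it is basically one HTTP GET plus trivial CSV parsing:

```python
import csv
import io
from urllib.parse import urlencode

# Hypothetical quote endpoint -- the real Google/Yahoo URLs are not
# reproduced here; this only illustrates the *shape* of such an API.
BASE_URL = "http://quotes.example.com/csv"

def build_quote_url(symbol, fields="sl1d1"):
    """Build the GET URL for one ticker symbol."""
    return BASE_URL + "?" + urlencode({"s": symbol, "f": fields})

def parse_quote_row(text):
    """Parse a one-line CSV response into (symbol, price, date)."""
    symbol, price, date = next(csv.reader(io.StringIO(text)))
    return symbol, float(price), date

# Hypothetical response body, for illustration only:
sample = '"GOOG",447.77,"8/14/2009"\n'
print(build_quote_url("GOOG"))
print(parse_quote_row(sample))
```

The point is that the request and the response format are simple and
documented, which is exactly why such interfaces stay stable.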

Sage has some code for pulling in data and scraping web pages.
E.g., three that come to mind include

    magma_free -- run Magma code, without having Magma, via a web form
at USydney (I doubt that API will change any time soon, since I
personally wrote it for the Magma group about 5 years ago).

    sloane_find -- look up integer sequences in Sloane's tables

    finance.Stock -- gets historical stock data, current price, etc.

The Sloane sequence lookup code did have to be updated once.
The finance.Stock code once failed for me when I had 25 students
all using it for 2 hours straight, all going through a single notebook
server -- it looked like "over usage" to Google.  We added a fallback
feature so that if the data can't be fetched from Google for whatever
reason (e.g., throttling), the exact same function falls back to
getting the data from Yahoo instead.
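That fallback logic is nothing Sage-specific; a minimal sketch of the
pattern, with stand-in fetch functions since the real ones live inside
finance.Stock, looks like this:

```python
def fetch_with_fallback(symbol, sources):
    """Try each (name, fetch) pair in order; return the first result.

    If every source fails (throttling, network error, format change),
    raise so the caller sees what went wrong.
    """
    last_error = None
    for name, fetch in sources:
        try:
            return fetch(symbol)
        except Exception as e:
            last_error = e   # e.g. Google deciding we are "over usage"
    raise RuntimeError("all sources failed for %s" % symbol) from last_error

# Stand-in fetchers for illustration; the real ones would hit the
# Google and Yahoo HTTP APIs.
def google_fetch(symbol):
    raise IOError("HTTP 403: over usage")   # simulate throttling

def yahoo_fetch(symbol):
    return {"symbol": symbol, "source": "yahoo"}

print(fetch_with_fallback("GOOG", [("google", google_fetch),
                                   ("yahoo", yahoo_fetch)]))
# -> {'symbol': 'GOOG', 'source': 'yahoo'}
```

The caller sees the exact same function either way, which is why the
classroom throttling incident was a one-time fix.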

William

> Another possibility, which could avoid some of these issues, is to
> mirror the data on the Sage web site and get Sage to fetch the data
> from there, not from http://www.xe.com/, NPL, or anywhere
> else like that. Then, if the format of the data changes, updates need to
> be applied to the Sage web site, not to every version of Sage in
> existence. That is what Mathematica does - when a currency conversion is
> looked up, the data is collected from Wolfram.com.
>
> Then you can get issues with companies blocking an IP address, if one
> IP keeps interrogating the data. I used to get fed up with late-running
> trains on my local line and wanted to prove that the number of trains
> running late was higher than the advertised figures.
>
> So I wrote a web parser to get data on the current running times of
> trains on my local line. Then I could find out exactly how late each
> train was. That was until my IP address got banned.
>
> Without a stable API, the web based lookup might be more hassle than it
> is worth.

These days there are stable APIs for this sort of thing.
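For the currency case, the lookup-plus-periodic-refresh idea sketches
out to a few lines.  The fetch function here is a stand-in for whatever
stable endpoint (or Sage-hosted mirror, as you suggest) gets chosen:

```python
import time

class RateCache:
    """Cache conversion rates, re-fetching at most every max_age seconds."""

    def __init__(self, fetch, max_age=300):
        self.fetch = fetch        # callable: (frm, to) -> float
        self.max_age = max_age
        self._cache = {}          # (frm, to) -> (rate, timestamp)

    def rate(self, frm, to):
        now = time.time()
        entry = self._cache.get((frm, to))
        if entry is None or now - entry[1] > self.max_age:
            entry = (self.fetch(frm, to), now)
            self._cache[(frm, to)] = entry
        return entry[0]

# Stand-in fetcher; a real one would do the HTTP request.
calls = []
def fake_fetch(frm, to):
    calls.append((frm, to))
    return 0.70   # made-up USD->EUR rate, for illustration only

cache = RateCache(fake_fetch, max_age=300)
cache.rate("USD", "EUR")
cache.rate("USD", "EUR")
print(len(calls))   # -> 1: only one fetch within the cache window
```

Caching like this also keeps any one IP from hammering the data
provider, which addresses the banned-IP problem you describe.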

> You might not agree with all or any of the above, but I hope it
> highlights some possible issues.
>
> Dave
>
>



-- 
William Stein
Associate Professor of Mathematics
University of Washington
http://wstein.org

--~--~---------~--~----~------------~-------~--~----~
To post to this group, send an email to sage-devel@googlegroups.com
To unsubscribe from this group, send an email to 
sage-devel-unsubscr...@googlegroups.com
For more options, visit this group at http://groups.google.com/group/sage-devel
URLs: http://www.sagemath.org
-~----------~----~----~----~------~----~------~--~---
