[web2py] Re: Best approach to using the DAL with external data sources that will go into multiple tables?

2014-04-09 Thread Massimo Di Pierro
P.S. Financial companies usually like to use PyTables for storing this kind of 
historical market data.
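
For reference, a minimal PyTables sketch of that idea (one HDF5 table per 
symbol). Everything here is illustrative: the file name, the column layout and 
the sample values are assumptions, not anything discussed in this thread.

# Minimal PyTables sketch: one HDF5 table per symbol holding daily bars.
# Requires PyTables 3.x ("pip install tables"); layout and values are
# placeholders for illustration only.
import tables

class DailyBar(tables.IsDescription):
    date   = tables.StringCol(10)    # 'YYYY-MM-DD'
    close  = tables.Float64Col()
    volume = tables.Int64Col()

h5 = tables.open_file('market.h5', mode='w')
bars = h5.create_table('/', 'aapl', DailyBar, title='AAPL daily bars')
row = bars.row
row['date']   = '2014-04-09'
row['close']  = 523.48               # placeholder value
row['volume'] = 1000000              # placeholder value
row.append()
bars.flush()
h5.close()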




[web2py] Re: Best approach to using the DAL with external data sources that will go into multiple tables?

2014-04-09 Thread Massimo Di Pierro
PersistentDictionary is simply a key/value store built on top of SQLite.
Would you be able to share any example code for accessing the Bloomberg Data 
Terminal and ICE?

Massimo




[web2py] Re: Best approach to using the DAL with external data sources that will go into multiple tables?

2014-04-09 Thread Trent Telfer
I am actually pulling the data from a Bloomberg Data Terminal, the Bank of 
Canada, ICE and NGX. All of them offer different ways to access the data: some 
providers have fairly nice APIs (Bloomberg), while others like NGX only 
provide Excel files.
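
For the Excel-only feeds, a hedged sketch of one way to pull a spreadsheet 
into a DAL table with xlrd. The file name, the sheet layout (date in column 0, 
price in column 1) and the table/field names are assumptions, not NGX's actual 
export format.

# Hedged sketch: import an NGX-style Excel export with xlrd and insert the
# rows through the web2py DAL. File name, sheet layout and field names are
# assumptions for illustration only.
import datetime
import xlrd
from gluon.dal import DAL, Field     # web2py 2.x standalone DAL import

db = DAL('sqlite://market.sqlite')
db.define_table('price_point',
                Field('source', 'string', length=30),
                Field('trade_date', 'date'),
                Field('value', 'double'))

book = xlrd.open_workbook('ngx_prices.xls')
sheet = book.sheet_by_index(0)
for r in range(1, sheet.nrows):      # skip the header row
    y, m, d = xlrd.xldate_as_tuple(sheet.cell_value(r, 0), book.datemode)[:3]
    db.price_point.insert(source='NGX',
                          trade_date=datetime.date(y, m, d),
                          value=sheet.cell_value(r, 1))
db.commit()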

I'm interested in this persistent dictionary concept and am curious whether it 
can at least be adapted for use with the Bloomberg terminal data.

-Trent




[web2py] Re: Best approach to using the DAL with external data sources that will go into multiple tables?

2014-04-09 Thread Massimo Di Pierro
Are you getting the data from Yahoo Finance?
Look into github.com/mdipierro/nlib:

from nlib import *

symbol = 'AAPL'
d = PersistentDictionary()           # SQLite-backed, dict-like cache
if symbol in d:
    h = d[symbol]                    # reuse the cached history
else:
    h = d[symbol] = YStock(symbol).historical()
print h[0].adjusted_close

PersistentDictionary is like shelve, but it uses SQLite and is therefore 
thread safe.
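
If the goal is to feed the dashboard tables, the cached history could then be 
copied into the DAL. A hedged sketch follows; the table, its field names, and 
the assumption that each history record exposes date and adjusted_close 
attributes are mine, not something stated in this thread.

# Hedged sketch: copy the cached history (h, from the example above) into a
# DAL table so the dashboard can query it. Field names and the record
# attributes are assumptions for illustration.
from gluon.dal import DAL, Field

db = DAL('sqlite://market.sqlite')
db.define_table('market_data',
                Field('symbol', 'string', length=30),
                Field('trade_date', 'date'),
                Field('close_price', 'double'))

for bar in h:
    db.market_data.update_or_insert(
        (db.market_data.symbol == symbol) &
        (db.market_data.trade_date == bar.date),
        symbol=symbol, trade_date=bar.date, close_price=bar.adjusted_close)
db.commit()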







[web2py] Re: Best approach to using the DAL with external data sources that will go into multiple tables?

2014-04-09 Thread thehuman trashcan
Hi,

I am totally new to programming, but in a bid to improve I am trying to answer 
any question where I reckon I can add value, so here goes!

I would probably define two tables:

*Sources data*
This table describes each of your 38 sources and gives it an ID; you can also 
store a name, the URL you are pulling from, a description, etc.

*Time Series data*
This table would have four columns:
ID (unique ref)
reference ID - this links to the previous table
Date
Amount

This way you can put all the data in one table and minimise duplicate info. 
You may also want to put an index on the reference ID and Date columns; 
indexes speed up searching, which will matter if you end up with lots of 
values.

Your dashboard can then pull the values from the time series table to build 
the graphs, while pulling names and source info from the sources table.
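
A hedged DAL sketch of that two-table layout (table and field names are 
illustrative; the index is created with raw SQL, which is the usual web2py 
approach):

# Hedged sketch of the two-table layout described above, for the web2py DAL.
# Names are illustrative only.
from gluon.dal import DAL, Field

db = DAL('sqlite://dashboard.sqlite')
db.define_table('source',
                Field('name', 'string'),
                Field('url', 'string'),
                Field('description', 'text'))
db.define_table('timeseries',
                Field('source_id', 'reference source'),
                Field('value_date', 'date'),
                Field('amount', 'double'))

# The DAL itself does not create secondary indexes, so issue raw SQL once
# (SQLite syntax) to speed up per-series date lookups.
db.executesql('CREATE INDEX IF NOT EXISTS idx_timeseries '
              'ON timeseries (source_id, value_date);')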

Let me know what you think!


On Tuesday, 8 April 2014 00:53:20 UTC+2, Trent Telfer wrote:
>
> I am attempting to build a small webpage that takes some pricing data from 
> a few external sources and displays it on one concise page (a dashboard of 
> sorts). My problem is that I have 38 time series to load into the database, 
> and I am hoping someone here can suggest a way around writing multiple 
> define_tables. All the data is in the form of dates with one data point 
> each, but the series don't necessarily all start at the same time.
>
> Thanks,
>
> Trent
>



[web2py] Re: Best approach to using the DAL with external data sources that will go into multiple tables?

2014-04-09 Thread Trent Telfer
Brian M,

Thanks for the reply. I am looking at doing the following with historical 
market information:

a) Base Table

ID,Exchange,Description,Base Currency
Char(30),Char(10),Varchar(256),Char(5)

b) Market Data

ID,Date,HighPrice,LowPrice,OpenPrice,ClosePrice,Volume
Char(30),Date,Float,Float,Float,Float,Long

I am unsure whether that table setup is the best choice or whether there is a 
better way to approach it.
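
For reference, that layout maps onto the DAL roughly as follows. This is only 
a sketch: the field names are mine, and the Char(30) key becomes a separate 
symbol field because the DAL reserves id for its own auto-increment key.

# Hedged translation of the proposed tables into DAL field types.
from gluon.dal import DAL, Field

db = DAL('sqlite://market.sqlite')

# a) Base Table
db.define_table('instrument',
                Field('symbol', 'string', length=30),
                Field('exchange', 'string', length=10),
                Field('description', 'string', length=256),
                Field('base_currency', 'string', length=5))

# b) Market Data
db.define_table('market_data',
                Field('instrument', 'reference instrument'),
                Field('trade_date', 'date'),
                Field('high_price', 'double'),
                Field('low_price', 'double'),
                Field('open_price', 'double'),
                Field('close_price', 'double'),
                Field('volume', 'bigint'))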

I've also been wondering whether I should jump into the world of NoSQL 
(specifically Cassandra), as I may need pricing intervals finer than daily in 
the future.

-Trent


On Tuesday, April 8, 2014 6:48:50 PM UTC-6, Brian M wrote:
>
> Assuming each source needs the same data fields, how about just using one 
> table and including an extra field to specify which source each record came 
> from?
> Or if you really want a separate table for each timeseries, you could look 
> into using table inheritance. 
> http://web2py.com/books/default/chapter/29/06/the-database-abstraction-layer#Table-inheritance
>
> A little more about how you are planning to use or display the data might 
> help. 38 separate tables, one table with 38 columns or rows? As a graph 
> with each series being a line?
>
> ~Brian
>
>
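
A hedged sketch of the two alternatives quoted above (one table plus a source 
field, and table inheritance as described in the linked book chapter); the 
table and field names are illustrative, not from this thread.

# Hedged sketch of both suggestions; names are illustrative only.
from gluon.dal import DAL, Field

db = DAL('sqlite://market.sqlite')

# (a) One table for all series, with an extra field naming the source.
db.define_table('price_point',
                Field('source', 'string', length=30),   # e.g. 'ICE', 'NGX'
                Field('trade_date', 'date'),
                Field('value', 'double'))

# (b) Table inheritance: define the common fields once and reuse them.
db.define_table('base_series',
                Field('trade_date', 'date'),
                Field('value', 'double'))
db.define_table('ice_series', db.base_series)                 # copies both fields
db.define_table('ngx_series', db.base_series, Field('hub', 'string'))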

-- 
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)
--- 
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to web2py+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.