Paul --
Light dawns over marble head.
Now I get it. Shorter arrays, yes. Total size of the database is not the
issue.
Thank you very much.
-- Keith
On 2/25/2010 01:42, paultsho wrote:
Keith
After you've divided your data into chunks, you can import them all into a
single database or multiple databases, as you wish. If you choose to import them
all into a single db (and there are advantages with that), the performance
improvement will be significant because the size of the array
Keith,
Thanks very much for such a detailed and quick reply!
I have been using TradeStation for years. I will use it to update my intraday
database of AB.
How to create/manage an AB database will be my next job.
Currently, I have a problem unzipping those zipped files from PITrading. I have a
wi
Charles --
There are some minor problems:
First, not all data sources time-stamp their data the same. Some
sources (PItrading, WealthLab, and TradeStation, to name a few) stamp
with the Closing time of the bar, whereas some others
(InteractiveBrokers and eSignal) stamp with the Opening time of
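To make the difference concrete, take the 1-min bar covering
09:30:00-09:30:59 (my own illustration, not from any vendor's docs):

    close-stamped feeds (TradeStation, PItrading, WealthLab): 09:31:00
    open-stamped feeds  (InteractiveBrokers, eSignal):        09:30:00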
Paul --
Sorry, somehow I missed your response over a month ago. Thank you for
pointing out that I may be able to do what I need in .afl. You are
probably correct. And it has been quite a while since I last programmed
in C++ and was not looking forward to the re-learning process.
I'm not qui
Keith,
I placed an order to buy those CDs. Do you have any problems reading data from
them into AmiBroker? Is the quality of the data good?
Thanks.
Charles
--- In amibroker@yahoogroups.com, Keith McCombs wrote:
>
> Mike, gariki, and James --
> I have stock data from PItrading.com. 1100 stocks, m
Hello Keith
Partitioning data into smaller periods consists of 2 steps:
1. Exporting data from ... to a certain period. There are some AFL scripts lying
around that would do the exporting to an ASCII file. All you need is an extra
if statement to export data only if it is within your desired date range a
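Something along these lines would cover step 1 (an untested sketch; the
C:\export\ folder and the DateNum range are placeholders to adapt):

    // Export OHLCV to one ASCII file per symbol, restricted to a date range.
    // Run as an Exploration/Scan over the watchlist you want to dump.
    startDN = 1000101;  // DateNum() format: (Year-1900)*10000 + Month*100 + Day
    endDN   = 1001231;  // so these two mean 2000-01-01 .. 2000-12-31

    fh = fopen( "C:\\export\\" + Name() + ".csv", "w" );
    if ( fh )
    {
        dn = DateNum();
        tn = TimeNum();
        for ( i = 0; i < BarCount; i++ )
        {
            // the "extra if statement": keep only bars inside the chunk
            if ( dn[ i ] >= startDN AND dn[ i ] <= endDN )
            {
                fputs( StrFormat( "%07.0f,%06.0f,%.4f,%.4f,%.4f,%.4f,%.0f\n",
                       dn[ i ], tn[ i ],
                       O[ i ], H[ i ], L[ i ], C[ i ], V[ i ] ), fh );
            }
        }
        fclose( fh );
    }
    Filter = Status( "lastbarinrange" );  // one row per symbol in the Exploration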
Longstrangest --
Thank you for your reply.
I'm not about to attack SQL in the near future. However, it's nice that
I am not all alone with this problem.
And your post below gives me some encouragement to try to attack the
problem. I also like your suggestion for reducing AB overhead by making
A technique I've been using to deal with large intraday historical DBs is to
store the data in a SQL database, use the ODBC plugin (with some custom mods)
and use a SQL stored procedure to fetch the data and deliver the bars to AB.
Then you can control how much data AB "sees" (has access to) b
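If you go that route, the AB side can stay tiny. A sketch, with the caveat
that "uspGetBars" is a made-up stored procedure and I'm quoting the plugin's
odbcGetArraySQL() from memory -- check the ODBC plugin docs before copying:

    // Let the SQL server decide how many bars AB ever sees.
    // odbcGetArraySQL() runs a statement and returns the first column
    // as an AFL array (per my reading of the plugin docs).
    sql = "EXEC uspGetBars '" + Name() + "', 90";  // e.g. last 90 days of closes
    serverClose = odbcGetArraySQL( sql );
    Plot( serverClose, "Close (from SQL)", colorDefault, styleLine );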
Mike, gariki, and James --
I have stock data from PItrading.com. 1100 stocks, most going back 7 years.
AB uses 40 bytes to store one data point (OHLCV). That means, for 1-min
data, 15,600 bytes per day and 3,931,200 bytes per year. For 1100 stocks
that is 4,324,320,000 bytes, 4 GB for one year. I'
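Spelling that estimate out (assuming ~390 regular-session 1-min bars per day
and ~252 trading days per year):

        390 bars/day x 40 bytes       =        15,600 bytes/day
     15,600 bytes/day x 252 days      =     3,931,200 bytes/year
  3,931,200 bytes/yr x 1,100 symbols  = 4,324,320,000 bytes, ~4 GB/year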
January 15, 2010 5:58:44 PM
Subject: [amibroker] Re: Using large intra day historical data bases
I have done exactly this recently, but with only about 5 years of intraday 1-min
data. In the end, once the 5-year database got loaded (the initial database
creation took a while), I started using that database instead of the five 1-year
databases and used walk-forward settings to set the backtest peri
Keith,
I will admit that I have not used it. But wouldn't a single database using
QuickAFL solve the problem for you?
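For what it's worth, a formula can also cap how many bars AB hands it with
SetBarsRequired(); a sketch, with purely illustrative numbers:

    // With QuickAFL enabled, AB passes the formula only the bars it needs;
    // SetBarsRequired() lets you state that need explicitly.
    SetBarsRequired( 100000, 0 );  // at most ~100k past bars, no future bars
    Buy  = Cross( MA( Close, 50 ), MA( Close, 200 ) );
    Sell = Cross( MA( Close, 200 ), MA( Close, 50 ) );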
Mike
--- In amibroker@yahoogroups.com, Keith McCombs wrote:
>
> I recently purchased 1min historical data, in ASCII format, for system
> development and backtesting. Some of