Hi Ben,
I thought about this, but the table would end up with millions of rows of
data, and the only useful index would have at least 700 duplicates; I am not
sure how efficient that would be. It would also introduce another data
problem, in that each of the 700 data sets would need what is essentially an
"autonumber" in one column. This is not a true autonumber, though it could be
done with the autonumber command if each data set were in its own table. The
number in this column must begin with 1 and be sequential within each value
of the index column mentioned above. This column would then have its own
index, which would add another 700 duplicates. Most of the work done on the
table will involve this "autonumber" column.
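To make the scheme concrete, here is a rough sketch (in Python with SQLite,
purely for illustration; the table and column names are my own invention and
this is not R:Base code) of a single table with a data-set column plus the
per-set sequential "autonumber" I mean:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One big table instead of 700: data_set identifies the original table
# (1-700), seq_no is the per-data-set "autonumber" that must start at 1
# and stay sequential within its data set.
cur.execute("""
    CREATE TABLE readings (
        data_set INTEGER NOT NULL,
        seq_no   INTEGER NOT NULL,
        value    REAL,
        PRIMARY KEY (data_set, seq_no)
    )
""")

def add_row(data_set, value):
    """Insert a row, assigning the next seq_no within its data set."""
    cur.execute(
        "SELECT COALESCE(MAX(seq_no), 0) + 1 FROM readings WHERE data_set = ?",
        (data_set,),
    )
    next_seq = cur.fetchone()[0]
    cur.execute(
        "INSERT INTO readings (data_set, seq_no, value) VALUES (?, ?, ?)",
        (data_set, next_seq, value),
    )
    return next_seq

add_row(1, 10.5)   # seq_no 1 in data set 1
add_row(1, 11.2)   # seq_no 2 in data set 1
add_row(2, 99.0)   # seq_no restarts at 1 for data set 2
```

Note that the composite key (data_set, seq_no) is unique as a pair, so the
index itself would not carry the 700-fold duplication; only an index on
data_set alone would.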
Speed is going to be important. Every day, around 120 complex calculations
are made for each table, with the results stored back in the table. The
report analysis is a whole other very complex operation.
I guess at this point I could ask: what is the maximum number of rows that a
table can accept, and at what point does it become too unwieldy?
Best,
Mike Young
On Tue, 25 Sep 2001 23:30:44 +0100, Ben Petersen wrote:
>Mike,
>
>If these tables have identical structures, couldn't you add one more
>integer column to identify which data set each row belongs to (1-700) and
>manage from there? Just a thought.
>
>Ben Petersen
>
>
>
>On 25 Sep 2001, at 23:00, Michael Young wrote:
>
>> Hi Troy,
>>
>> I might be better off putting it into a few R:Base databases than DBASE
>> files, but only testing will tell. I thought the same thing about this, and
>> I have racked my brain to get around using this many tables, but it appears
>> there is no way. The only way to cut down the number of tables would be to
>> create a three-dimensional table, and I don't think that is slated for the
>> next version of R:Base <G>.
>>
>> Best regards,
>> Mike Young
>>
>> On Tue, 25 Sep 2001 21:34:33 -0600, Troy Sosamon wrote:
>>
>> >Speed-wise, using any foreign data source is a lot slower than native
>> >R:base; even using sconnect between R:base databases is slow.
>> >
>> >700 tables w/ 130 col in each. I think the relational database police will
>> >come hunt you down???
>> >
>> >Troy
>> >
>> >>===== Original Message From [EMAIL PROTECTED] =====
>> >>Thanks Manuel and Castanaro,
>> >>
>> >>Looks like I will probably end up going the DBASE route.
>> >>
>> >>Best regards,
>> >>Mike Young
>> >>
>> >>On Tue, 25 Sep 2001 17:21:39 -0700, Manuel de Aguiar wrote:
>> >>
>> >>>Hello Mike,
>> >>>----------------------------------------------------------------------------
>> >>>Tables/Views/Columns/Rows
>> >>>
>> >>>Create up to 16,000 tables or views
>> >>>Create up to 16,000 columns
>> >>>Create up to 400 columns (fields) per table or view
>> >>>Maximum row (record) lengths of 4096 bytes
>> >>>----------------------------------------------------------------------------
>> >>>I found this information at:
>> >>> http://rbase2000.com/Products/rbase.htm
>> >>>Hope this helps,
>> >>>Manuel
>> >>>
>> >>>Michael Young wrote:
>> >>>
>> >>>> Hello,
>> >>>>
>> >>>> Does anybody know the limits in terms of the number of tables and
>> >>>> columns that are allowed in RBWIN 6.5++?
>> >>>>
>> >>>> I am looking at developing a database with approximately 700 tables
>> >>>> and around 130 columns in each table.
>> >>>>
>> >>>> I realize that I may need to use multiple databases, but I have
>> >>>> another option: create DBASE tables, since there really is no
>> >>>> relationship between most of the tables, and I can just attach them as
>> >>>> needed. Does anyone know the speed or other drawbacks to keeping data
>> >>>> in a DBASE file?
>> >>>>
>> >>>> Thanks,
>> >>>> Mike Young
>> >>>
>> >
>> >Troy Sosamon
>> >Denver Co
>> >[EMAIL PROTECTED]
>> >