>> ItemA, ItemB, etc. are the actual item part numbers.
But will they always code this the same way, or could the item be A-100 or A100 or A 100, depending on their mood?

Marc
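
If the coding does drift like that, one hedge is a small normalization pass before matching (a hypothetical sketch; the actual rule would need to match the customer's real conventions):

```python
import re

def normalize_item(code: str) -> str:
    """Collapse whitespace and hyphens so A-100, A100, and A 100 compare equal."""
    return re.sub(r"[\s\-]+", "", code).upper()

# All three spellings fold to the same key.
print({normalize_item(c) for c in ("A-100", "A100", "A 100")})  # {'A100'}
```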



From: [email protected] 
Sent: Tuesday, January 12, 2010 8:57 AM
To: RBASE-L Mailing List 
Subject: [RBASE-L] - Re: Large Spread sheet matrix data


I guess my question is geared more towards a longer-term, "no intervention required" solution.



I.e., I can manually import the spreadsheet using ODBC or Gateway, etc., as you mention. However, since the number of columns could change, I am trying to come up with a solution that would handle that situation automatically. Any apps that have columns hard-coded would have to be changed each time an item# (i.e., a column) was added or removed.

I am attempting to come up with a system that would allow the customer to send these spreadsheets in on a regular basis and have Rbase integrate them seamlessly, regardless of the number of columns, without me having to modify the program each time.



The spreadsheet has a header row:

Store#    ItemA   ItemB   ItemC   etc.

where ItemA, ItemB, etc. are the actual item part numbers.



So the system would need to read the spreadsheet header row to determine what each column is (as these could change), then load the data rows in accordingly. Not an easy task, but perhaps possible. So that is the first question: is it feasible to load a changing matrix automatically?
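
For what it's worth, the header-driven load can be sketched as a small pre-processing step outside of R:BASE (a hypothetical Python example, assuming the sheet is first exported to CSV): read the item numbers from the header row at run time, then emit one (Store#, Item#) row per "Y" cell, however many columns there happen to be.

```python
import csv
import io

# Toy data in the format described: first column is the store number,
# the remaining column names are the actual item part numbers.
sample = """Store#,ItemA,ItemB,ItemC
0100,Y,N,Y
0200,N,Y,N
"""

def matrix_to_rows(fileobj):
    """Yield (store, item) pairs for every 'Y' cell, whatever the columns are."""
    reader = csv.reader(fileobj)
    header = next(reader)   # item numbers come from the header row, not the code
    items = header[1:]      # everything after Store# is an item column
    for row in reader:
        store = row[0]
        for item, flag in zip(items, row[1:]):
            if flag.strip().upper() == "Y":
                yield store, item

rows = list(matrix_to_rows(io.StringIO(sample)))
print(rows)  # [('0100', 'ItemA'), ('0100', 'ItemC'), ('0200', 'ItemB')]
```

Because the column list is rebuilt from the header on every run, adding or removing an item column requires no program change.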



The second question is data design. Is a simple table (Store#, Item#) with 300,000-plus rows the best design? Of the 300,000 rows, there would be only 1950 unique store numbers and only 157 unique item numbers.
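
As a sketch of that design (illustrated with SQLite and hypothetical names; R:BASE SQL would be analogous): a two-column table keyed on (StoreNo, ItemNo) that stores only the "Y" combinations answers the summary questions with simple counts, and holds far fewer than the full 1950 x 157 = 306,150 rows.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE StoreItem (StoreNo TEXT, ItemNo TEXT, "
    "PRIMARY KEY (StoreNo, ItemNo))")

# Only the 'Y' cells are stored; a missing row means 'N'.
con.executemany("INSERT INTO StoreItem VALUES (?, ?)",
                [("0100", "ItemA"), ("0100", "ItemC"),
                 ("0200", "ItemB"), ("0300", "ItemB")])

# How many stores stock ItemB?
(n_stores,) = con.execute(
    "SELECT COUNT(*) FROM StoreItem WHERE ItemNo = 'ItemB'").fetchone()

# How many items does store 0100 carry?
(n_items,) = con.execute(
    "SELECT COUNT(*) FROM StoreItem WHERE StoreNo = '0100'").fetchone()

print(n_stores, n_items)  # 2 2
```

The composite primary key also guards against a store/item pair being loaded twice.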



Thanks-









----- Original Message -----
From: "James Bentley" <[email protected]>
To: "RBASE-L Mailing List" <[email protected]>
Sent: Tuesday, January 12, 2010 8:36:02 AM GMT -06:00 US/Canada Central
Subject: [RBASE-L] - Re: Large Spread sheet matrix data



Bob,

You might create an ODBC connection to the spreadsheet. I believe there is a Microsoft driver to do that. If not, unload the spreadsheet as a CSV and use the Microsoft CSV/Text ODBC connector. You would use the ODBC connection to define a 158-column table: Store Number and the 157 items.

You then can define various views to access the data. The UNION statement could be used effectively to transform the column data to row data by store.
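
That UNION transformation can be sketched like this (SQLite for illustration, with a three-item toy table standing in for the 158-column one; the real statement would simply have one SELECT arm per item column):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE Matrix (StoreNo TEXT, ItemA TEXT, ItemB TEXT, ItemC TEXT)")
con.execute("INSERT INTO Matrix VALUES ('0100', 'Y', 'N', 'Y')")

# One SELECT per item column, glued together with UNION ALL, turns the
# wide matrix into (store, item, flag) rows.
unpivot = """
SELECT StoreNo, 'ItemA' AS ItemNo, ItemA AS Stocked FROM Matrix
UNION ALL
SELECT StoreNo, 'ItemB', ItemB FROM Matrix
UNION ALL
SELECT StoreNo, 'ItemC', ItemC FROM Matrix
ORDER BY StoreNo, ItemNo
"""
long_rows = con.execute(unpivot).fetchall()
print(long_rows)
# [('0100', 'ItemA', 'Y'), ('0100', 'ItemB', 'N'), ('0100', 'ItemC', 'Y')]
```

The drawback is the same one Bob raised: the statement names every item column, so it has to be regenerated whenever a column is added or removed.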


Jim Bentley
American Celiac Society
[email protected]
tel: 1-504-737-3293 





--------------------------------------------------------------------------------
From: "[email protected]" <[email protected]>
To: RBASE-L Mailing List <[email protected]>
Sent: Tue, January 12, 2010 8:01:08 AM
Subject: [RBASE-L] - Large Spread sheet matrix data



I have an application where a supplied data source is an Excel spreadsheet. This spreadsheet is a matrix consisting of locations as rows and stocked items as columns. (I have no control over the spreadsheet design!)


Store#   ItemA   ItemB   ItemC   ItemD ...

0100     Y       N       N       Y

0200     Y       Y       N       N

0300     N       Y       Y       N

0400     N       N       N       Y

etc.





This spreadsheet is updated regularly, with stores added or removed (# of rows), stock items added or removed (# of columns), and the store information changed (Y/N).

Currently there are 1950 rows in the spreadsheet and 157 items (columns).



I need to link Rbase to this spreadsheet and be able to run routines against store# and what items each store stocks: information such as the # of stores that ItemA is stocked in, how many items Store 0100 has, etc.

Any thoughts on an efficient method to link this continually changing data into Rbase table(s)? A simple table with Store Number and Item Number would have 306,150 rows of data (1950 x 157) and would also be difficult to populate from the above matrix.



Innovative ideas appreciated!



Thanks-

Bob




 

