You can write a class to import this via ODBC from a staging database; that way all of the referential integrity (RI) checks are applied (a sketch follows below). I have written classes for all of our data imports, regardless of the size of the dataset.
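A minimal sketch of such a class, written as a job here for brevity. The DSN name and the staging table and its columns are all hypothetical placeholders; map them to your own staging schema, and expect to assign many more fields in a real import:

    static void importItemsFromStaging(Args _args)
    {
        LoginProperty  loginProperty = new LoginProperty();
        OdbcConnection odbc;
        Statement      stmt;
        ResultSet      rs;
        InventTable    inventTable;
        ;
        loginProperty.setDSN('StagingDSN');   // hypothetical DSN pointing at the staging database
        odbc = new OdbcConnection(loginProperty);
        stmt = odbc.createStatement();

        // Hypothetical staging table and column names.
        rs = stmt.executeQuery('select ItemId, ItemName, ItemGroupId from StagingItems');

        ttsbegin;
        while (rs.next())
        {
            inventTable.clear();
            inventTable.initValue();
            inventTable.ItemId      = rs.getString(1);
            inventTable.ItemName    = rs.getString(2);
            inventTable.ItemGroupId = rs.getString(3);

            // validateWrite() runs the table's validation before the insert,
            // which is what keeps the RI correct.
            if (inventTable.validateWrite())
                inventTable.insert();
        }
        ttscommit;
    }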

 

Michael G. Simms
Senior Developer, Vita-Mix Corporation
MCP, CBIP
8615 Usher Road
Cleveland, Ohio 44138-2103
Office: (440) 782-2607
Cell: (330) 307-4971
www.vitamix.com
[EMAIL PROTECTED]
 

 


From: Hemant Kumthekar [mailto:[EMAIL PROTECTED]
Sent: Friday, February 11, 2005 11:08 AM
To: Axapta-Knowledge-Village@yahoogroups.com
Subject: Re: [Axapta-Knowledge-Village] Re: Very large item master

 

Hi,

 

To upload an Item Master, BOMs, and Routings with large amounts of data, MS-SQL DTS is the better option; the other way is to create CSV files. We had the same problem in one of our implementations: with around 700K records in the BOM, it took us 5 hours to upload using CSV, even with the upload run directly on the server. We used a combination of Access/Excel, converting to text (CSV) files, but I think MS-SQL DTS is the better option anytime. We seriously considered writing a tool to pump the data in, but since our implementation deadline was tight we decided against it; now we are re-thinking it.

 

HK

Subrahmanyam Mamidi <[EMAIL PROTECTED]> wrote:

Hey, sorry for my third mail on this; something else came to mind. Actually, I didn't have access to the SQL database, as my administrator didn't give me access (kidding!). Did you try to load directly into SQL using DTS? How does that sound? I guess it's worth a try.

 

Thanks,

Subbu

SDBC

Subrahmanyam Mamidi <[EMAIL PROTECTED]> wrote:

Just adding to my last email: at the end of the load, you need to run a consistency check to make sure your data is valid.

Subrahmanyam Mamidi <[EMAIL PROTECTED]> wrote:

Hi,

 

Presently I am migrating master data into Axapta.

 

1. Axapta is relatively slow with Excel because it uses COM for communication. It took approximately 2 hours to upload ~9,000 records into just one table (InventTable).

 

So I wrote jobs that read from the flat file and insert into the tables individually. For every record in InventTable there should be a record in InventDim and InventItemLocation. By default Axapta uses 'Axapta' as the InventDimId, and you then need to map that in InventItemLocation. If you have warehouses, you need InventDim::findOrCreate to generate the InventDimId and attach it to the InventItemLocation, or you will not see the items in the item master form. A rough sketch of such a job follows.
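Something like this, assuming a CSV laid out as ItemId, ItemName, ItemGroupId, warehouse (the file path and column order are made up, and a real load would map many more fields):

    static void importItemsFromCsv(Args _args)
    {
        AsciiIo            io = new AsciiIo('C:\\import\\items.csv', 'R'); // hypothetical file
        container          line;
        InventTable        inventTable;
        InventDim          inventDim;
        InventItemLocation itemLocation;
        ;
        io.inFieldDelimiter(',');

        ttsbegin;
        while (io.status() == IO_Status::Ok)
        {
            line = io.read();
            if (!line)
                break;

            inventTable.clear();
            inventTable.initValue();
            inventTable.ItemId      = conpeek(line, 1);
            inventTable.ItemName    = conpeek(line, 2);
            inventTable.ItemGroupId = conpeek(line, 3);
            inventTable.insert();

            // findOrCreate returns the existing dimension record, or
            // inserts a new one and hands back its InventDimId.
            inventDim.clear();
            inventDim.InventLocationId = conpeek(line, 4); // warehouse
            inventDim = InventDim::findOrCreate(inventDim);

            itemLocation.clear();
            itemLocation.initValue();
            itemLocation.ItemId      = inventTable.ItemId;
            itemLocation.InventDimId = inventDim.InventDimId;
            itemLocation.insert();
        }
        ttscommit;
    }

For a 1.9-million-item load you would also want to commit in batches of a few thousand rows rather than in one transaction, and consider RecordInsertList if your version supports it.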

In the same way, I wrote jobs for ConfigTable (configurations), ReqItemTable (item coverage), and InventDimCombination (cost prices), and then warehouses and locations, and so on. It is really painful, but if you can write the jobs, I feel that is easier. We use Conversion Studio; it handles multiple formats well, but when it comes to this data volume it takes almost the same time.

 

Please let me know if you need anything, as I have already written the scripts.

 

Thanks & Regards,

Subbu

SDBC implementation.

Danny Gaethofs <[EMAIL PROTECTED]> wrote:


Dear Lourens,

The only way to get this done is by importing the data directly into
Axapta from a flat file in CSV format.


You cannot get by with importing into the InventTable only.
You will also have to populate these tables (a sketch of the module records follows the list):
* InventTableModule
* InventItemLocation
* InventDim
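A minimal sketch of filling InventTableModule, written as a helper you might put on an import class. Each item needs one record per module (Invent, Purch, Sales); the default unit here is an assumed placeholder:

    static void createItemModules(ItemId _itemId)
    {
        InventTableModule module;
        container         types = [ModuleInventPurchSales::Invent,
                                   ModuleInventPurchSales::Purch,
                                   ModuleInventPurchSales::Sales];
        int               i;
        ;
        for (i = 1; i <= conlen(types); i++)
        {
            module.clear();
            module.initValue();
            module.ItemId     = _itemId;
            module.ModuleType = conpeek(types, i);
            module.UnitId     = 'Pcs';   // assumed default unit
            module.insert();
        }
    }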

regards,
Danny


--- In Axapta-Knowledge-Village@yahoogroups.com, "lourens1210"
<[EMAIL PROTECTED]> wrote:
>
> Hi all
>
> We will soon be working at a client with an item master in excess of
> 1.9 million items. The migration of these items to Axapta seems a
> daunting task considering the following:
>
> Excel can handle 65,536 lines of data in a sheet, which means if one
> uses Axapta's Excel spreadsheet templates you will need about 30
> templates. This will take about 8 full days to upload into Axapta.
>
> Besides these logistical hassles, the process of cleaning up the data
> before import will also be a messy one, to say the least.
>
> Does anybody have any suggestions on how to approach this?
>
> Lourens




