Hi David,

> We have been using DataExtract to save our data and then import it again.
> Unfortunately now that we have gone up to 1 million records 
> in a table, it takes 45 mins to do the import!
> I know there are various ways to do this - what is the **quickest** and most 
> efficient?  
> What kind of speed can I expect in this case?

The answer depends on several preconditions and circumstances.
Here are the most important ones:

If you want your extracted data in a *readable* format, you have to choose the
*compressed* or the *formatted* format. If your table definition doesn't contain
*BLOB* columns, you can use the *fastload* command to load the data:
http://www.sapdb.org/7.4/htmhelp/ec/d93fb6400d11d3aa27006094b92fad/frameset.htm
Note that *fastload* is not supported on tables with *BLOB* columns.
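As an illustration only, a *fastload* command script for the Replication Manager might look roughly like the sketch below. The table name, column positions, and file name are invented for the example, and the exact keywords should be checked against the documentation linked above rather than taken from here:

```
FASTLOAD TABLE hotel.customer
  cno    1
  title  2
  name   3
INFILE 'customer.data'
COMPRESSED
```

Here the numbers after the column names give the field positions in the input file, and COMPRESSED names the format the data was extracted in.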

Otherwise, you can use the **quickest** internal format, *pages*, to transport your data:
http://www.sapdb.org/7.4/htmhelp/ec/d93fd1400d11d3aa27006094b92fad/frameset.htm
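With the *pages* format, the extract and the reload would be a matching pair of commands, along the lines of the untested sketch below (names invented; please verify the keywords against the linked documentation):

```
DATAEXTRACT TABLE hotel.customer
OUTFILE 'customer.pages'
PAGES

DATALOAD TABLE hotel.customer
INFILE 'customer.pages'
PAGES
```

Because *pages* transfers whole database pages instead of formatted records, the output file is not human-readable, but it avoids the per-record conversion work, which is why it is the fastest option.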


-- 
Hans-Georg Bumes
SAP DB, SAP Labs Berlin
http://www.sapdb.org/


_______________________________________________
sapdb.general mailing list
[EMAIL PROTECTED]
http://listserv.sap.com/mailman/listinfo/sapdb.general
