I have tried using this with SXSSF as follows:
ArrayList<File> fileList = new ArrayList<File>();
File file1 = new File("E:/temp/f1.xlsx");
File file2 = new File("E:/temp/f2.xlsx");
File file3 = new File("E:/temp/sxssf3.xlsx");
fileList.add(file1);
fileList.add(file2);

int lastRowProcessed = 0;
for (Iterator iterator = fileList.iterator(); iterator.hasNext();) {
    File filename = (File) iterator.next();
    OPCPackage pkg = OPCPackage.open(new FileInputStream(filename.getAbsolutePath()));
    XSSFWorkbook xssfwb = new XSSFWorkbook(pkg);
    // keep 100 rows in memory, exceeding rows will be flushed to disk
    Workbook wb = new SXSSFWorkbook(xssfwb, 100);
    Sheet sh = wb.createSheet();
    int startCount = lastRowProcessed;
    for (int rownum = startCount; rownum < 10 + startCount; rownum++) {
        Row row = sh.createRow(rownum);
        for (int cellnum = 0; cellnum < 10; cellnum++) {
            Cell cell = row.createCell(cellnum);
            String address = new CellReference(cell).formatAsString();
            cell.setCellValue("StringTest" + cellnum);
        }
        lastRowProcessed = rownum;
    }
    java.io.FileOutputStream out = new java.io.FileOutputStream(file3);
    wb.write(out);
    out.close();
}
But the problem is with wb.write(out): after this code runs, only the data of
f2.xlsx is present in sxssf3.xlsx. Maybe my approach is wrong. I want to merge
f1.xlsx and f2.xlsx to produce sxssf3.xlsx.
But how can I keep the data from all the files while still avoiding the memory issue?
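In case it helps frame the question: the output workbook is created and written
inside the loop, so each pass overwrites file3 and only the last input survives.
One possible restructuring (a sketch, not tested against POI) is to create the
output workbook once before the loop, carry a running row offset while copying
each input, and call wb.write(out) only after the loop. Below is a minimal plain
Java sketch of that control flow, with the POI calls reduced to list operations
so the shape is clear; readRows and its sample data are hypothetical stand-ins:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class MergeSketch {
    // Hypothetical stand-in for reading one input workbook's rows.
    static List<String> readRows(String filename) {
        if (filename.endsWith("f1.xlsx")) return Arrays.asList("a1", "a2");
        return Arrays.asList("b1", "b2", "b3");
    }

    public static void main(String[] args) {
        List<String> inputs = Arrays.asList("E:/temp/f1.xlsx", "E:/temp/f2.xlsx");

        // Create the output ONCE, outside the loop
        // (in POI terms: one SXSSFWorkbook and one createSheet()).
        List<String> output = new ArrayList<>();
        int lastRowProcessed = 0;

        for (String filename : inputs) {
            // Open the current input and copy its rows at the running offset
            // (in POI terms: sh.createRow(lastRowProcessed) per source row).
            for (String row : readRows(filename)) {
                output.add(row); // row lands at index lastRowProcessed
                lastRowProcessed++;
            }
            // Close only the input here; do NOT write the output yet.
        }

        // Write the output ONCE, after all inputs were copied
        // (in POI terms: wb.write(out); out.close();).
        System.out.println(output);
        System.out.println(lastRowProcessed);
    }
}
```

With this shape, file3 is opened and written a single time, so the rows of
f1.xlsx are followed by the rows of f2.xlsx instead of being overwritten.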
--
View this message in context:
http://apache-poi.1045710.n5.nabble.com/Approach-to-overcome-Xssf-performance-problem-tp5031686p5031756.html
Sent from the POI - User mailing list archive at Nabble.com.