Hi everyone,
I'm trying to create a workflow for processing next-generation sequencing
data, so the input files are large (~15 GB each).
Basically I'm using a BeanShell script that looks like the following:
String command = "java -Xms256m -Xmx512m -jar /path/to/Programm.jar";
Process proc = null;
Runtime rt = Runtime.getRuntime();
// Note the separating spaces -- without them the four arguments are
// concatenated into one mangled token on the command line.
proc = rt.exec(command
    + " " + input1
    + " " + input2
    + " " + input3
    + " " + output);
int exitVal = proc.waitFor();
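For what it's worth, a ProcessBuilder is usually more robust here than Runtime.exec with a concatenated string: each argument is its own list element (no manual space handling), and redirecting the child's output to a file keeps the JVM from having to buffer a potentially huge output stream in memory. A minimal sketch; `echo` stands in for the real `java -jar /path/to/Programm.jar` invocation, and all file names are placeholders:

```java
import java.io.File;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class RunTool {
    public static void main(String[] args)
            throws IOException, InterruptedException {
        // Stand-in for:
        //   java -Xms256m -Xmx512m -jar /path/to/Programm.jar in1 in2 in3 out
        // One list element per argument -- no spaces to get wrong.
        List<String> cmd = Arrays.asList(
                "echo", "input1.fastq", "input2.fastq", "out.bam");

        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.redirectErrorStream(true);            // merge stderr into stdout
        pb.redirectOutput(new File("tool.log")); // stream output to disk, not RAM
        Process proc = pb.start();
        int exitVal = proc.waitFor();
        System.out.println("exit=" + exitVal);
    }
}
```

If the child process writes a lot to stdout/stderr and nothing drains those streams, waitFor() can also block or the buffered output can grow unexpectedly, so the redirect is worth having either way.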
When running the workflow I get the following error:
java.lang.OutOfMemoryError: Requested array size exceeds VM limit
My Linux machine has 16 GB of memory, and I have already tried giving
Taverna more memory at startup. In a gnome-terminal the command runs
without any errors. Does anyone have suggestions?
I'm really thankful for any help on this topic.
Best Regards,
Julian
_______________________________________________
taverna-users mailing list
[email protected]
Web site: http://www.taverna.org.uk
Mailing lists: http://www.taverna.org.uk/about/contact-us/