Hello - what counts as a large amount of memory, how do you measure it (make sure 
you look at RES, not VIRT), and what are your JVM settings?

It is not uncommon for programs to allocate a lot of memory if the default max heap 
is used - 2 GB in my case. If your JVM eats too much, limit it by setting -Xmx to 
a lower value.
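For example, a launch command with an explicit heap cap (512 MB and the class/classpath names here are only illustrative placeholders, not a recommendation):

```shell
# Cap the maximum heap at 512 MB; tune the value to your workload.
# TikaTest and tika-app.jar are placeholders for your own application.
java -Xmx512m -cp tika-app.jar:. TikaTest
```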

Markus
 
-----Original message-----
> From:Will Jones <systemdotf...@gmail.com>
> Sent: Tuesday 3rd January 2017 18:14
> To: user@tika.apache.org
> Subject: Memory issues with the Tika Facade
> 
> Hi, 
> 
> Big fan of what you are doing with Apache Tika. I have been using the Tika 
> facade to fetch metadata on each file in a directory containing a large 
> number of files.  
> 
> It returns the data I need, but the running process very quickly consumes a 
> large amount of memory as it proceeds through the files. 
> 
> What am I doing wrong? I have attached the code required to reproduce my 
> problem below. 
> 
> 
> import java.nio.file.Files;
> import java.nio.file.Path;
> import java.nio.file.Paths;
> 
> import org.apache.tika.Tika;
> import org.apache.tika.metadata.Metadata;
> 
> public class TikaTest {
> 
>     public void tikaProcess(Path filePath) {
>         Tika t = new Tika();
>         try {
>             Metadata metadata = new Metadata();
> 
>             String result = t.parse(filePath, metadata).toString();
>         } catch (Exception e) {
>             e.printStackTrace();
>         }
>     }
> 
>     public static void main(String[] args) {
>         TikaTest tt = new TikaTest();
>         try {
>             Files.list(Paths.get("g:/somedata/")).forEach(
>                     path -> tt.tikaProcess(path)
>             );
>         } catch (Exception e) {
>             e.printStackTrace();
>         }
>     }
> }
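[Editor's note: a likely culprit in the snippet above is that the Tika facade's `parse(Path, Metadata)` returns a `Reader` that parses lazily in the background; calling `toString()` on it neither reads the content nor closes it, so each call leaves the parse resources live. A sketch of a variant that drains and closes the reader with try-with-resources - the class name is hypothetical, and this assumes the same `Tika.parse(Path, Metadata)` API used in the original example:]

```java
import java.io.Reader;
import java.nio.file.Path;

import org.apache.tika.Tika;
import org.apache.tika.metadata.Metadata;

public class TikaProcessSketch {

    public void tikaProcess(Path filePath) {
        Tika t = new Tika();
        Metadata metadata = new Metadata();
        // try-with-resources closes the Reader even if parsing fails,
        // releasing the buffers and parse state it holds on to.
        try (Reader reader = t.parse(filePath, metadata)) {
            StringBuilder sb = new StringBuilder();
            char[] buf = new char[8192];
            int n;
            while ((n = reader.read(buf)) != -1) {
                sb.append(buf, 0, n);
            }
            // The extracted text, rather than the Reader's toString().
            String result = sb.toString();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```

If only the metadata is needed and the body text can be discarded, the loop can simply drain the reader without appending to a builder.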
