Where did you read anything about a 2G heap being “in the danger zone”? I
routinely see heap sizes in the 16G range and greater. The default 512M is
actually _much_ lower than it probably should be, see:
https://issues.apache.org/jira/browse/SOLR-13446
The “danger” if you allocate too much memory is that less is left for the OS disk cache, and full garbage collections take longer.
I think you can safely increase the heap size to 1 GB or whatever you need.
Be aware though:
Solr's performance depends heavily on file system caches, which are not on the
heap! So you need more memory freely available than what you configure as heap.
How much more depends on your index size.
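As an illustration of that balance (the numbers are hypothetical, not a recommendation): on a machine with, say, 32 GB of RAM you might cap the heap well below physical RAM so the rest stays free for the OS to cache index files:

```shell
# Illustrative only: keep the heap modest so the OS page cache gets the rest.
# Newer Solr reads SOLR_HEAP from bin/solr.in.sh; -m does the same per start:
bin/solr start -m 8g
# Older setups pass the JVM flags directly:
# java -Xms8g -Xmx8g -jar start.jar
```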
I am using Solr version 6.6.0 and the heap size is set to 512 MB, which I
believe is the default. We have almost 10 million documents in the index, we
perform frequent updates (we are doing a soft commit on every update; the heap
issue was seen with and without soft commit) to the index, and we see:
java.lang.OutOfMemoryError: Java heap space
Hi,
I am trying to retrieve all the documents from a Solr index in a batched manner.
I have 100M documents. I am retrieving them using the method proposed here:
https://nowontap.wordpress.com/2014/04/04/solr-exporting-an-index-to-an-external-file/
ay/solr/Pagination+of+Results
> > M.
> >
> >
> >
> > > -Original message-
> > > From: Ajinkya Kale
> > > Sent: Monday 28th September 2015 20:46
> > > To: solr-user@lucene.apache.org; java-u...@lucene.apache.org
> > > Subject: Solr java.lang.OutOfMemoryError: Java heap space
I get "OutOfMemoryError" if start is at 50M. I get the same error even if
rows=10 for start=50M.
Curl with start=0 and rows=50M in one go works fine, but things go bad when
start is at 50M.
My Solr version is 4.4.0.
Caused by: java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2367)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:130)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java)
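For completeness: deep paging with start=50M forces Solr to sort and track start+rows documents, which is exactly what blows the heap. Solr 4.7+ offers cursorMark for this (not available in the poster's 4.4.0); a hedged sketch, with core and field names as placeholders:

```shell
# First request: cursorMark=* and a sort ending on the uniqueKey field.
curl 'http://localhost:8983/solr/collection1/select?q=*:*&rows=1000&sort=id+asc&cursorMark=*&wt=json'
# Then feed each response's nextCursorMark back in as cursorMark and
# repeat until nextCursorMark stops changing.
```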
Steve Rowe [sar...@gmail.com] wrote:
> 1 Lakh (aka Lac) = 10^5 is written as 1,00,000
>
> It’s used in Bangladesh, India, Myanmar, Nepal, Pakistan, and Sri Lanka,
> roughly 1/4 of the world’s population.
Yet it still causes confusion and distracts from the issue. Let's just stick to
metric, okay?
On Jul 25, 2014, at 9:13 AM, Shawn Heisey wrote:
> On 7/24/2014 7:53 AM, Ameya Aware wrote:
> The odd location of the commas in the start of this thread makes it hard
> to understand exactly what numbers you were trying to say.
On 7/24/2014 7:53 AM, Ameya Aware wrote:
> I did not make any other change than this.. rest of the settings are
> default.
>
> Do i need to set garbage collection strategy?
The collector chosen and its tuning params can have a massive impact
on performance, but it will make no difference at all for an
OutOfMemoryError: only a bigger heap (or a smaller memory footprint) fixes that.
Ooh ok.
So you want to say that since I am using a large heap but didn't tune my
garbage collection, that's why I am getting the java heap space error?
On Thu, Jul 24, 2014 at 9:58 AM, Marcello Lorenzi wrote:
> I think that with a large heap it is suggested to monitor the garbage
> collection behavior and try to adopt a strategy adapted to your performance.
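For reference, a commonly-cited starting point for GC tuning on a 4 GB Solr heap looks like the sketch below (the flags are illustrative, and note again that GC tuning affects pause times, not OutOfMemoryError itself):

```shell
# G1 collector with GC logging enabled, so behavior can actually be monitored.
java -Xms4g -Xmx4g \
     -XX:+UseG1GC -XX:+ParallelRefProcEnabled \
     -XX:MaxGCPauseMillis=250 \
     -verbose:gc -Xloggc:gc.log \
     -jar start.jar
```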
Hi,
I am in the process of indexing around 2,00,000 documents.
I have increased the Java heap space to 4 GB using the command below:
java -Xmx4096M -Xms4096M -jar start.jar
Still, after indexing around 15000 documents it gives a java heap space error
again.
Any fix for this?
Thanks,
Ameya
you will see much more clearly what the likely cause is.
Harald.
On 22.07.2014 19:37, Ameya Aware wrote:
Hi,
I am running into a java heap space issue. Please see the log below.
ERROR - 2014-07-22 11:38:59.370; org.apache.solr.common.SolrException;
null:java.lang.RuntimeException
So can I get past this exception by increasing the heap size somewhere?
Thanks,
Ameya
On 7/22/2014 11:37 AM, Ameya Aware wrote:
> I am running into a java heap space issue. Please see the log below.
All we have here is an out-of-memory exception. It is impossible to
know *why* you are out of memory from the exception alone. With enough
investigation, we could determine the area of code involved.
Hi,
I am running into a java heap space issue. Please see the log below.
ERROR - 2014-07-22 11:38:59.370; org.apache.solr.common.SolrException;
null:java.lang.RuntimeException: java.lang.OutOfMemoryError: Java heap space
at
org.apache.solr.servlet.SolrDispatchFilter.sendError(SolrDispatchFilter.java
OutOfMemoryError: Java heap space in Solr
To: solr-user@lucene.apache.org
Date: Wednesday, 9 July, 2014, 9:24 PM
Hi,
I am getting the OutOfMemory Error "java.lang.OutOfMemoryError: Java heap
space" often in production because a particular TreeMap is taking up more
memory in the JVM.
When I looked into the config files, I have an entity called
UserQryDocument where I am fetching the data from
On Wednesday, March 12, 2014 10:53 PM, Richard Marquina Lopez <
richard.marqu...@gmail.com> wrote:
Hi,
I have some problems when executing the delta import with 2 million rows
from a MySQL database:
java.lang.OutOfMemoryError: Java heap space
at java.nio.HeapCharBuffer.<init>(HeapCharBuffer.java:57)
at java.nio.CharBuffer.allocate(CharBuffer.java:331)
at java.nio.charset.CharsetDecoder.decode(CharsetDecoder.java:777)
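A fix often suggested for exactly this MySQL stack trace (hedged; the driver/url values here are placeholders): the MySQL JDBC driver buffers the whole result set in memory unless streaming is enabled, and DIH's batchSize="-1" maps to the driver's streaming mode (Statement.setFetchSize(Integer.MIN_VALUE)):

```xml
<!-- Sketch of a DIH dataSource with MySQL row streaming enabled -->
<dataSource driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://localhost/mydb"
            user="root" password="***"
            batchSize="-1"/>
```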
Hi,
the heap problem is due to memory being full.
You could remove unnecessary data and restart the server once.
On Thursday, 6 March 2014 10:39 AM, Angel Tchorbadjiiski
wrote:
Hi Shawn,
a big thanks for the long and detailed answer. I am aware of how Linux
uses free RAM for caching and of the problems related to the JVM and GC. It
is nice to hear how this correlates to Solr. I'll take some time and
think it over. The facet.method=enum and probably a combination of
DocValues
Hi Shawn,
On 05.03.2014 10:05, Angel Tchorbadjiiski wrote:
Hi Shawn,
It may be your facets that are killing you here. As Toke mentioned, you
have not indicated what your max heap is. 20 separate facet fields with
millions of documents will use a lot of fieldcache memory if you use the
standard facet.method, fc.
On Wed, 2014-03-05 at 09:59 +0100, Angel Tchorbadjiiski wrote:
> On 04.03.2014 11:20, Toke Eskildsen wrote:
> > Angel Tchorbadjiiski [angel.tchorbadjii...@antibodies-online.com] wrote:
> >
> > [Single shard / 2 cores Solr 4.6.1, 65M docs / 50GB, 20 facet fields]
> >
> >> The OS in use is a 64-bit Linux
Hi Shawn,
It may be your facets that are killing you here. As Toke mentioned, you
have not indicated what your max heap is. 20 separate facet fields with
millions of documents will use a lot of fieldcache memory if you use the
standard facet.method, fc.
Try adding facet.method=enum to all your facet requests.
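That advice as a concrete request (a sketch; the core and field names are placeholders, and enum can also be set per-field with f.&lt;field&gt;.facet.method):

```shell
curl 'http://localhost:8983/solr/collection1/select?q=*:*&rows=0&facet=true&facet.field=category&facet.method=enum'
```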
java.lang.OutOfMemoryError: Java heap space
at org.apache.solr.servlet.SolrDispatchFilter.sendError(SolrDispatchFilter.java:735)
...
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:724)
Hello list,
in the last couple of weeks one of my machines has been experiencing
OutOfMemoryError: Java heap space errors. Within a couple of hours after
starting the Solr instance, queries with execution times of under 100ms
need more than 10s to execute, and many Java heap space errors appear in
the logs.
>> If we take 400,000,000 docs, it will OOM at a
>> facet query. The facet field is tokenized by space.
>>
>> May 27, 2013 11:12:55 AM org.apache.solr.common.SolrException log
SEVERE: null:java.lang.RuntimeException: java.lang.OutOfMemoryError: Java heap space
at org.apache.solr.servlet.SolrDispatchFilter.sendError(SolrDispatchFilter.java:653)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:366)
Aah… I was doing a facet on a double field which has 6 decimal places…
No surprise that the Lucene cache got full…
./Zahoor
Memory increases a lot with queries which have facets…
./Zahoor
On 5/17/2013 1:17 AM, J Mohamed Zahoor wrote:
> I moved to 4.2.1 from 4.1 recently.. everything was working fine until I
> added a few more stats queries..
> Now I am getting this error frequently; Solr does not run even for 2
> minutes continuously.
> All 5GB is getting used instantaneously within a few queries...
SEVERE: null:java.lang.RuntimeException: java.lang.OutOfMemoryError: Java heap space
at org.apache.solr.servlet.SolrDispatchFilter.sendError(SolrDispatchFilter.java:653)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:366)
I'm embarrassed (but hugely relieved) to say that the script I had for
starting Jetty had a bug in the way it set Java options! So, my heap
start/max was always set at the default. I did end up using jconsole and
learned quite a bit from that too.
Thanks for your help Yonik :)
Matt
On Sat, Jan 16, 2010 at 11:04 AM, Matt Mitchell wrote:
> These are single valued fields. Strings and integers. Is there more specific
> info I could post to help diagnose what might be happening?
Faceting on either should currently take ~24MB (6M docs @ 4 bytes per
doc + size_of_unique_values)
These are single valued fields. Strings and integers. Is there more specific
info I could post to help diagnose what might be happening?
Thanks!
Matt
On Sat, Jan 16, 2010 at 10:01 AM, Matt Mitchell wrote:
> I have an index with more than 6 million docs. All is well, until I turn on
> faceting and specify a facet.field. There are only about 20 unique values for
> this particular facet throughout the entire index.
Hmmm, that doesn't sound right..
I have an index with more than 6 million docs. All is well, until I turn on
faceting and specify a facet.field. There are only about 20 unique values for
this particular facet throughout the entire index. I was able to make things
a little better by using facet.method=enum. That seems to work, until
Hi!
I downloaded Solr and am trying to index an XML file. This XML file is
huge (500M).
When I try to index it using the "post.jar" tool in example\exampledocs,
I get an "out of java heap space" error in the SimplePostTool
application.
Any ideas how to fix this? Passing in "-Xms1024M" does not fix it.
Feroze.
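One thing worth checking (an assumption, since the message doesn't say where -Xms1024M was passed): the flag has to reach the JVM that is actually failing, which here is the SimplePostTool's own JVM, not Solr's:

```shell
# Give the posting tool's JVM the memory, and/or post in smaller pieces.
java -Xmx1024m -jar post.jar huge.xml
```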
On Thu, Apr 16, 2009 at 10:31 AM, Mani Kumar wrote:
> Aah, Bryan you got it ... Thanks!
> Noble: so i can hope that it'll be fixed soon :) thank you for fixing it
> ...
> please lemme know when its done..
>
This is fixed in trunk. The next nightly build should have this fix.
--
Regards,
Shalin
Aah, Bryan, you got it... Thanks!
Noble: so I can hope that it'll be fixed soon :) Thank you for fixing it...
Please let me know when it's done.
Thanks!
Mani Kumar
2009/4/16 Noble Paul നോബിള് नोब्ळ्
Hi Bryan,
Thanks a lot. It is invoking the wrong method.
It should have been:
bsz = context.getVariableResolver().replaceTokens(bsz);
It was a silly mistake.
--Noble
On Thu, Apr 16, 2009 at 2:13 AM, Bryan Talbot wrote:
I think there is a bug in the 1.4 daily builds of data import handler
which is causing the batchSize parameter to be ignored. This was
probably introduced with more recent patches to resolve variables.
The affected code is in JdbcDataSource.java:
String bsz = initProps.getProperty("batchSize");
DIH streams 1 row at a time.
DIH is just a component in Solr. Solr indexing also takes a lot of memory
Yes, it's throwing the same OOM error and from the same place...
Yes, I will try increasing the size... just curious: how does this dataimport
work?
Does it load the whole table into memory?
Is there any estimate of how much memory it needs to create an index for 1GB
of data?
thx
mani
On Tue, Apr 14, 2009 at 11:36 AM, Mani Kumar wrote:
> Hi Shalin:
> yes i tried with batchSize="-1" parameter as well
>
> here the config i tried with:
>
> <dataSource driver="com.mysql.jdbc.Driver"
>             url="jdbc:mysql://localhost/mydb_development"
>             user="root" password="**"
>             batchSize="-1" />
>
> I hope i have used the batchSize parameter in the right place.
Hi Shalin:
Yes, I tried with the batchSize="-1" parameter as well;
I hope I have used the batchSize parameter in the right place.
Thanks!
Mani Kumar
On Tue, Apr 14, 2009 at 11:24 AM, Shalin Shekhar Mangar wrote:
On Tue, Apr 14, 2009 at 11:18 AM, Mani Kumar wrote:
> Here is the stack trace:
>
> notice in the stack trace: "at
> com.mysql.jdbc.MysqlIO.readAllResults(MysqlIO.java:1749)"
>
> It looks like it's trying to read the whole table into memory at once, and
> that's why it's getting OOM.
>
Mani, the data-
/bin/startup.sh -Xmn50M -Xms300M -Xmx400M
> >>>
> >>> I also tried tricks given on
> >>> http://wiki.apache.org/solr/DataImportHandlerFaq page.
> >>>
> >>> what shou
Apr 13, 2009 11:53:32 PM org.apache.solr.handler.dataimport.DataImporter
doFullImport
SEVERE: Full Import failed
org.apache.solr.handler.dataimport.DataImportHandlerException:
java.lang.OutOfMemoryError: Java heap space
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:400)
at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:221)
Caused by: java.lang.OutOfMemoryError: Java heap space
at com.mysql.jdbc.Buffer.<init>(Buffer.java:58)
at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1444)
at com.mysql.jdbc.MysqlIO.readSingleRowSet(MysqlIO.java:2840)
How much heap size have you allocated to the JVM?
Also see http://wiki.apache.or
On Mon, Apr 13, 2009 at 11:57 PM, Mani Kumar wrote:
> Hi All,
> I am trying to setup a Solr instance on my macbook.
>
> I get the following errors when I'm trying to do a full DB import... please
> help me on this:
>
> java.lang.OutOfMemoryError: Java heap space
org.apache.solr.handler.dataimport.DataImportHandlerException:
java.lang.OutOfMemoryError: Java heap space
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:400)
at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:221)
Thanks to all who responded. Things are running well! The IBM version
of the JRE for Intel 64 seems to run well, and the stalling issue
(when the Solr instance stops responding and freezes up) has disappeared.
What I learned is that Solr is a great product but needs "tuning" to fit the
usage.
>Install the AMD64 version. (Confusingly, AMD64 is a spec name for
>EM64T, which is now what both AMD and Intel use)
>If that still doesn't work, is it possible that your machine/kernel is
>not set up to support 64 bit?
I was confused by the naming convention. Seems to work fine now.
But on Intel, where I'm having the problem it shows:
java version "1.6.0_10-ea"
Java(TM) SE Runtime Environment (build 1.6.0_10-ea-b10)
Java HotSpot(TM) Server VM (build 11.0-b09, mixed mode)
I can't seem to find the Intel 64 bit JDK binary, can you pls. send
me the link?
I was downloading f
>We use 10GB of ram in one of our solr installs. You need to make sure
>your java is 64 bit though. Alex, what does your java -version show?
>Mine shows
>java version "1.6.0_03"
>Java(TM) SE Runtime Environment (build 1.6.0_03-b05)
>Java HotSpot(TM) 64-Bit Server VM (build 1.6.0_03-b05, mixed m
On Jan 28, 2008, at 7:06 PM, Leonardo Santagada wrote:
On 28/01/2008, at 20:44, Alex Benjamen wrote:
I could allocate more physical memory, but I can't seem to increase
the -Xmx option to 3800M; I get
an error: "Could not reserve enough space for object heap", even
though I have more than
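"Could not reserve enough space for object heap" at around 3.5 GB is the classic symptom of a 32-bit JVM, which is how this thread is eventually resolved. A quick check:

```shell
java -version
# A 64-bit JVM identifies itself in the banner, e.g.:
#   Java HotSpot(TM) 64-Bit Server VM (build 1.6.0_03-b05, mixed mode)
```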
On 28/01/2008, at 20:44, Alex Benjamen wrote:
Note: I have browsed, searched the forums for this error and
followed the most common advice of
increasing the memory allocation for the JVM:
/usr/bin/java -DSTOP.PORT=8805 -DSTOP.KEY=solrstop -Xmx3584M -
Xms1024M -jar start.jar
[snip]
e when I get a 500 error on
the solr/ping. But this is
ugly and bad for cache... Any ideas?
Thanks in advance!
-Alex
Jan 24, 2008 3:25:44 PM org.apache.solr.core.SolrException log
SEVERE: java.lang.OutOfMemoryError: Java heap space
at org.apache.sol
: I am new in Solr and trying to use Jetty and the example with 13 million records.
: While running it, I have the error -
: java.lang.OutOfMemoryError: Java heap space
: Any recommendation? We have a million transactions, so would it be better to
: use Tomcat?
Millions of records take up memory.
I am new in Solr and trying to use Jetty and the example with 13 million records.
While running it, I have the error -
HTTP ERROR: 500
Java heap space
java.lang.OutOfMemoryError: Java heap space
Any recommendation? We have a million transactions, so would it be better to
use Tomcat?
Thanks
On 5/15/06, Marcus Stratmann <[EMAIL PROTECTED]> wrote:
The only situation I get OutOfMemory
errors is after an optimize when the server performs an auto-warming
of the caches:
A single filter that is big enough to be represented as a bitset
(>3000 in general) will take up 1.3MB
Some ways to
fine now. The only situation I get OutOfMemory
errors is after an optimize when the server performs an auto-warming
of the caches:
SEVERE: Error during auto-warming of key:[EMAIL
PROTECTED]:java.lang.OutOfMemoryError: Java heap space
(from the tomcat log)
But nevertheless the server seems to run
Sorry, hit the wrong key before...
FYI, I have just committed all the changes related to the Jetty downgrade
into SVN.
Let me know if you notice any problems.
Bill
I was able to produce an OutOfMemoryError using Yonik's python script with
Jetty 6.
I was not able to do so with Jetty 5.1.11RC0, the latest stable version. So
that's the
version of Jetty with which I will downgrade the Solr example app to.
Bill
On 5/5/06, Erik Hatcher <[EMAIL PROTECTED]> wrote
Along these lines, locally I've been using the latest stable version
of Jetty and it has worked fine, but I did see an "out of memory"
exception the other day but have not seen it since so I'm not sure
what caused it.
Moving to Tomcat, as long as we can configure it to be as lightweight
a
There seem to be a fair number of folks using Jetty with the example app
as opposed to using Solr with their own appserver. So I think it is best to
use a stable version of Jetty instead of the beta. If no one objects, I can
go ahead and take care of this.
Bill
I verified that Tomcat 5.5.17 doesn't experience this problem.
-Yonik
On 5/3/06, Yonik Seeley <[EMAIL PROTECTED]> wrote:
I just tried sending in 100,000 deletes and it didn't cause a problem:
the memory grew from 22M to 30M.
Random thought: perhaps it has something to do with how you are
sending your requests?
Yep, I was able to reproduce a memory problem w/ Jetty.
Chris Hostetter wrote:
This is because building a full Solr distribution from scratch requires
that you have JUnit. But it is not required to run Solr.
Ah, I see. That was a very valuable hint for me.
I was able now to compile an older revision (393957). Testing this
revision I was able to delete
I just tried sending in 100,000 deletes and it didn't cause a problem:
the memory grew from 22M to 30M.
Random thought: perhaps it has something to do with how you are
sending your requests?
If the client creates a new connection for each request, but doesn't
send the Connection:close header or close the connection,
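A client-side sketch of what Yonik describes (the URL is a placeholder): if a scripted client opens a fresh connection per request, it should say so explicitly, or the server holds each socket open until it times out:

```shell
curl -H 'Connection: close' 'http://localhost:8983/solr/select?q=id:123'
```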
: next thing I tried was to get the code via svn. Unfortunately the code
: does not compile ("package junit.framework does not exist"). I found out
This is because building a full Solr distribution from scratch requires
that you have JUnit. But it is not required to run Solr.
Yonik Seeley wrote:
Is your problem reproducable with a test case you can share?
Well, you can get the configuration files. If you ask for the data, this
could be a problem, since this is "real" data from our production
database. The amount of data needed could be another problem.
You could al
Hi Marcus,
Is your problem reproducable with a test case you can share?
You could also try a different app-server like Tomcat to see if that
makes a difference.
What type is your id field defined to be?
-Yonik
On 5/3/06, Marcus Stratmann <[EMAIL PROTECTED]> wrote:
Hello,
deleting or updating