It seems to be working now.

- I have increased some values in jetty.xml:
  <Set name="responseBufferSize">65536</Set>
  <Set name="headerBufferSize">32768</Set>
  <Set name="responseBufferSize">32768</Set>
- And have left chunk size = 3000.
- But I'm not adding all the documents at the same time; instead, items are sent one by one to be indexed (so it's either that, or the altered Jetty configuration, that is preventing the previous error from being reproduced).

Thanks everyone.

2014-11-24 2:36 GMT+01:00 Alexandre Rafalovitch <arafa...@gmail.com>:

> Good point on that one Steve.
>
> Wireshark is both a hammer and a power drill of network
> troubleshooting. It takes steady hands to hold it right (it has a bit
> of a learning curve), but it is a great tool. I swore by it (well,
> Ethereal back then) in my tech support days.
>
> So, seconded: try using that if the simple approach fails outright.
>
> Regards,
>    Alex.
> Personal: http://www.outerthoughts.com/ and @arafalov
> Solr resources and newsletter: http://www.solr-start.com/ and @solrstart
> Solr popularizers community: https://www.linkedin.com/groups?gid=6713853
>
>
> On 23 November 2014 at 20:31, steve <sc_shep...@hotmail.com> wrote:
> >
> > For what it's worth, depending on the type of PC/Mac you're using, you
> > can use Wireshark to look at the active HTTP headers (sent and received)
> > that are being created for the request.
> > https://www.wireshark.org/
> > I don't have any financial interest in them, but the stuff works!
> > Steve
> >
> >> Date: Sun, 23 Nov 2014 20:47:05 +0100
> >> Subject: Re: Too much data after closed for HttpChannelOverHttp
> >> From: h.benoud...@gmail.com
> >> To: solr-user@lucene.apache.org
> >>
> >> Actually I'm using a PHP client (I think it sends an HTTP request to
> >> Solr), but you're right; tomorrow once I get to the office, I'll set
> >> the chunk size to a smaller value, and will tell you if that was the
> >> reason.
> >>
> >> Thanks.
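[Editor's note] The batching approach discussed in this thread can be sketched as below. This is a minimal illustration, not the poster's actual PHP code; `post_to_solr` is a hypothetical stand-in for whatever call actually performs the update (a PHP client, SolrJ, or a plain HTTP POST to `/solr/<core>/update`).

```python
def chunks(items, size):
    """Yield successive batches of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def index_in_chunks(items, size, post_to_solr):
    """Send the items in several modest requests instead of one huge one."""
    for batch in chunks(items, size):
        post_to_solr(batch)  # one request per batch

# Example with a fake sender that just records batch sizes:
sent = []
index_in_chunks(list(range(7000)), 3000, lambda batch: sent.append(len(batch)))
print(sent)  # → [3000, 3000, 1000]
```

Dropping the batch size (or sending items one by one, as the original poster ended up doing) keeps each request small enough that the server never rejects it mid-stream.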
> >>
> >> 2014-11-23 19:35 GMT+01:00 Alexandre Rafalovitch <arafa...@gmail.com>:
> >>
> >> > Most probably just a request that's too large. Have you tried
> >> > dropping down to 500 items and seeing what happens?
> >> >
> >> > Are you using SolrJ to send content to Solr? Or a direct HTTP request?
> >> >
> >> > Regards,
> >> >    Alex.
> >> > P.s. You may also find it useful to read up on Solr commits and
> >> > hard vs. soft commits. Check solrconfig.xml in the example
> >> > distribution.
> >> > Personal: http://www.outerthoughts.com/ and @arafalov
> >> > Solr resources and newsletter: http://www.solr-start.com/ and @solrstart
> >> > Solr popularizers community: https://www.linkedin.com/groups?gid=6713853
> >> >
> >> >
> >> > On 23 November 2014 at 12:31, Hakim Benoudjit <h.benoud...@gmail.com>
> >> > wrote:
> >> > > Hi there,
> >> > >
> >> > > I have deployed Solr with Jetty, and I'm trying to index a fairly
> >> > > large number of items (300K), retrieved from a MySQL database
> >> > > (unfortunately I'm not using DIH; I'm doing it manually, by getting
> >> > > the items from MySQL and then indexing them in Solr).
> >> > >
> >> > > But I'm not indexing all of those items at the same time; I'm
> >> > > indexing them in chunks of 3K. So, I get the first 3K, index them,
> >> > > then move on to the next 3K chunk and index it.
> >> > >
> >> > > Here is the error I got in the Jetty logs; I guess it has nothing
> >> > > to do with MySQL:
> >> > > Does anyone know the meaning of the error 'badMessage:
> >> > > java.lang.IllegalStateException: too much data after closed for
> >> > > HttpChannelOverHttp@5432494a'?
> >> > >
> >> > > Thanks for your help; if anything isn't very clear, please tell me
> >> > > and I'll explain (and sorry for my bad English).
> >> > >
> >> > > --
> >> > > Cordialement,
> >> > > Best regards,
> >> > > Hakim Benoudjit
> >> >
> >>
> >>
> >> --
> >> Cordialement,
> >> Best regards,
> >> Hakim Benoudjit
>

--
Cordialement,
Best regards,
Hakim Benoudjit
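[Editor's note] The error in the subject line generally means the client was still writing the request body after the server had already closed the HTTP connection (for example, after rejecting an oversized request), which fits the "one huge update request" theory discussed above. Here is a minimal sketch of that failure mode using plain sockets; it is not Solr or Jetty itself, just an illustration of a server that stops reading and closes while the client keeps streaming.

```python
import socket
import threading
import time

def tiny_server(srv):
    """Accept one connection, read only the first chunk, then close early,
    roughly like a server rejecting a request mid-body."""
    conn, _ = srv.accept()
    conn.recv(1024)
    conn.close()
    srv.close()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=tiny_server, args=(srv,), daemon=True).start()

cli = socket.socket()
cli.connect(("127.0.0.1", port))
error = None
try:
    # Keep writing long after the server has closed its side.
    for _ in range(100):
        cli.send(b"x" * 65536)
        time.sleep(0.01)
except OSError as exc:  # BrokenPipeError / ConnectionResetError
    error = exc
finally:
    cli.close()

print(error is not None)  # → True: writing after close eventually fails
```

On the client side this surfaces as a broken pipe or connection reset; on the server side, Jetty logs the leftover bytes as "too much data after closed". Smaller batches (or larger request buffers, as configured above) avoid triggering the early close in the first place.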