We are using Solr version 4.4
--
View this message in context:
http://lucene.472066.n3.nabble.com/java-io-EOFException-seek-past-EOF-tp4137817p4137959.html
Sent from the Solr - User mailing list archive at Nabble.com.
Hi Solr developers,
Thanks very much for your timely reply.
1. I'm sorry, I made a mistake: the total number of documents is 32
million, not 320 million.
2. The system has plenty of memory for the Solr index: the OS has 256 GB in
total, and I set the Solr Tomcat heap with HEAPSIZE="-Xms25G -Xmx100G"
-How many fields
Hey all,
I've got a number of nodes (Solr 4.4 Cloud) that I'm balancing with HAProxy for
queries. I'm indexing pretty much constantly, and have autoCommit and
autoSoftCommit on for near-real-time searching. All works nicely, except that
occasionally the auto-commit cycles are far enough off th
The Tomcat setup is fine. I insist that it's a Solr issue. The whole index
consists of Greek ("funny" characters) and Solr returns them normally. The
problem here is that I cannot concatenate Greek characters in
"data-config.xml" (hard-coded).
bq: I think that it happens at index time
How do you know that? If you're looking at the results in a browser,
you do _not_ know that. If you're looking at the raw values in, say,
SolrJ, then you _might_ know that; there's still the issue of whether
you're sending the docs to Solr and your servlet c
You'll have better luck asking the folks at OpenNLP. This isn't really a
Solr question.
On Fri, May 23, 2014 at 6:38 PM, rashi gandhi wrote:
> HI,
>
>
>
> I have one running solr core with some data indexed on solr server.
>
> This core is designed to provide OpenNLP functionalities for indexing and
> searching.
Are you sure that you compiled your code with the proper Solr jars so that
the class signature (extends, implements, and constructors) matches the Solr
4.7.2 jars? I mean, Java is simply complaining that your class is not a
valid value source class of the specified type.
-- Jack Krupansky
---
There's an example of using curl to make a REST call to update a core on
this page:
https://wiki.apache.org/solr/UpdateXmlMessages
If that doesn't help, please let us know what error you're receiving.
Michael Della Bitta
Applications Developer
o: +1 646 532 3062
appinions inc.
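As an illustration of the XML update message format that wiki page documents, here is a minimal sketch of building the `<add><doc>` body; the host, port, and core name ("collection1") below are assumptions about a default setup, not something from this thread.

```java
// Sketch: build the <add><doc> XML update body described on the
// UpdateXmlMessages wiki page. The endpoint in the comment assumes a
// default single-core setup; adjust host/port/core for your install.
import java.util.LinkedHashMap;
import java.util.Map;

public class UpdateXmlSketch {
    // Turn a field->value map into a Solr XML add message.
    static String addDoc(Map<String, String> fields) {
        StringBuilder sb = new StringBuilder("<add><doc>");
        for (Map.Entry<String, String> e : fields.entrySet()) {
            sb.append("<field name=\"").append(e.getKey()).append("\">")
              .append(e.getValue()).append("</field>");
        }
        return sb.append("</doc></add>").toString();
    }

    public static void main(String[] args) {
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("id", "1");
        fields.put("title", "hello");
        // POST this body to http://localhost:8983/solr/collection1/update
        // with Content-Type: text/xml (e.g. curl --data-binary @file.xml).
        System.out.println(addDoc(fields));
    }
}
```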
I think that it happens at index time. The reason is that when I query for
the specific field, Solr returns the "?" string!
What version of Solr are you using? There were some issues like this
in the 4.1 time-frame.
Best,
Erick
On Fri, May 23, 2014 at 3:39 AM, aarthi wrote:
> Hi
> We are getting the seek past EOF exception in Solr. This occurs randomly, and
> after a reindex we are able to access data again. After running CheckIndex,
> we got no corrupt blocks.
A couple of possibilities:
1> The data in Solr is fine. Your browser is getting the proper
characters back but is not set up to handle the proper character set,
so it displays "?".
2> Your servlet container is not set up (either inbound or outbound)
to handle the character set you're sending.
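Possibility 1 is easy to reproduce: force Greek text through a charset that cannot represent it (ISO-8859-1 here, as a stand-in for whatever the browser or container is assuming) and every Greek character becomes a literal '?', which matches the symptom reported in this thread.

```java
// Demonstrates how Greek text turns into "?" when pushed through a
// charset that cannot represent it. Java's String.getBytes replaces
// each unmappable character with '?' when encoding to ISO-8859-1.
import java.nio.charset.StandardCharsets;

public class CharsetDemo {
    static String roundTrip(String s) {
        byte[] latin1 = s.getBytes(StandardCharsets.ISO_8859_1); // Greek -> 0x3F '?'
        return new String(latin1, StandardCharsets.ISO_8859_1);
    }

    public static void main(String[] args) {
        System.out.println(roundTrip("ΤΜΗΜΑ")); // prints "?????"
    }
}
```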
Hi All,
I have my own popularity value source class, and I let Solr know about it
via solrconfig.xml. But then I get the following class cast exception.
I have tried to make sure there are no old Solr jar files in the classpath.
Why would this be happening?
I even tried using the lib tag.
张月祥 [zhan...@calis.edu.cn] wrote:
> Could anybody tell us some internals about "Too many values for
> UnInvertedField faceting on field xxx" ?
I must admit I do not fully understand it in detail, but it is a known problem
with field cache (facet.method=fc) faceting. The remedy is to use DocValues.
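Toke's suggested remedy would be a schema.xml change along these lines; this fragment is a hypothetical example — the field name "code" comes from the question below, but the type and other attributes must match your actual schema, and switching to DocValues requires a full reindex.

```xml
<!-- schema.xml sketch: enable DocValues for the faceted field so
     faceting no longer relies on UnInvertedField / the field cache.
     "string" is the stock StrField type; a full reindex is required. -->
<field name="code" type="string" indexed="true" stored="true" docValues="true"/>
```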
Could anybody tell us some internals about "Too many values for
UnInvertedField faceting on field xxx" ?
We have two Solr servers.
Solr A:
128 GB RAM, 60M docs, 2,600 different terms in field “code”; every term of
field “code” has a fixed length of 6.
the sum count of token of field “cod
On Fri, May 23, 2014 at 11:37 AM, Toke Eskildsen
wrote:
> Per Steffensen [st...@designware.dk] wrote:
>> * It IS more efficient to just use the index for the
>> "no_dlng_doc_ind_sto"-part of the request to get doc-ids that match that
>> part and then fetch timestamp-doc-values for those doc-ids t
Per Steffensen [st...@designware.dk] wrote:
> * It IS more efficient to just use the index for the
> "no_dlng_doc_ind_sto"-part of the request to get doc-ids that match that
> part and then fetch timestamp-doc-values for those doc-ids to filter out
> the docs that does not match the "timestamp_dlng
I can answer some of this myself now that I have dug into it to
understand what Solr/Lucene does and to see if it can be done better.
* In current Solr/Lucene (or at least in 4.4), indices on both
"no_dlng_doc_ind_sto" and "timestamp_dlng_doc_ind_sto" are used, and the
doc-id-sets found are intersected
Hello,
I looked at the source code of post.jar; that was very interesting.
I also looked at Apache ManifoldCF; that was interesting too.
But what I want to do is index some files using HTTP REST. This is my
request, which doesn't work; maybe this way is the easiest to implement:
put: localhost:808
Hi,
How can I change the field name in the "grouped" section of the Solr
response?
I know that for changing the field names in the response where Solr returns
documents, you can make a query with "fl" changed as
"fl=mapping1:fieldname1,mapping2:fieldname2".
How do I achieve the same thing for "grouping"?
Hey Anass,
Have a look at another Apache project: http://manifoldcf.apache.org
It works with Tomcat/Solr. It is handy for handling deletions and
incremental updates.
On Friday, May 23, 2014 3:41 PM, benjelloun wrote:
Hello,
There is no inconvenience, I just need to index some files from the system
Feel free to look at the source code for post.jar. I mean, all it is really
doing is scanning the directory (optionally recursively) and then streaming
each file to Solr.
-- Jack Krupansky
-Original Message-
From: benjelloun
Sent: Friday, May 23, 2014 8:15 AM
To: solr-user@lucene.apa
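What post.jar's directory scan boils down to can be sketched like this. The glob pattern and the idea of handing each matched file to an HTTP sender are assumptions about how you would wire it up; the actual sending step would POST each file's bytes to your core's /update (or /update/extract for .doc files via Solr Cell), whose URL depends on your setup.

```java
// Sketch of post.jar's core loop: walk a directory tree (optionally
// recursively), pick files matching a pattern, and collect them so each
// one can be streamed to Solr. The HTTP POST itself is left out; only
// the scan is shown.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.PathMatcher;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Stream;

public class FileScanSketch {
    // Recursively collect regular files whose names match the glob.
    static List<Path> scan(Path root, String glob) throws IOException {
        PathMatcher m = root.getFileSystem().getPathMatcher("glob:" + glob);
        List<Path> hits = new ArrayList<>();
        try (Stream<Path> walk = Files.walk(root)) {
            walk.filter(Files::isRegularFile)
                .filter(p -> m.matches(p.getFileName()))
                .forEach(hits::add);
        }
        return hits;
    }
}
```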
Hi,
I have one running Solr core with some data indexed on the Solr server.
This core is designed to provide OpenNLP functionalities for indexing and
searching.
So I have kept the following binary models at this location:
\apache-tomcat-7.0.53\solr\collection1\conf\opennlp
· en-sent.bin
Hello,
There is no inconvenience, I just need to index some files from the system
using JEE and Tomcat 6; maybe there is a function which calls the HTTP REST
API. Maybe there is a solution to integrate post.jar into Tomcat 6.
If you know any solution to my problem, please suggest it to me.
Thanks,
Best regards,
Hi
We are getting the seek past EOF exception in Solr. This occurs randomly, and
after a reindex we are able to access data again. After running CheckIndex,
we got no corrupt blocks. Kindly shed some light on the issue. The following
is the error log:
2014-05-21 13:57:29,172 INFO processor.LogUpdatePr
Is there a particular reason you are averse to using post.jar? I mean, if
there is some bug or inconvenience, let us know so we can fix it!
The Solr server itself does not provide any ability to "crawl" file systems
(LucidWorks Search does); post.jar does provide that convenience.
-- Jack Krupansky
There is no direct Solr configuration option to disable commit requests that
I know of.
Maybe you could do it with an update processor. The processAdd method is
called to process a document; it is passed an AddUpdateCommand object for a
single document, which has a field for the commitWithin setting.
Hi,
I'm trying to index data from MySQL. The indexing is successful. Then I
tried to use the MySQL CONCAT function (in data-config.xml) in order to
concatenate a custom string with a field, like this: CONCAT('(',
CAST('ΤΜΗΜΑ' AS CHAR CHARACTER SET utf8), ' ', apofasi_tmima, ')'). The
custom string
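For context, a DataImportHandler config carrying such a CONCAT expression would look roughly like the fragment below. The table, column, and entity names are placeholders; also note, as a hedged suggestion, that Greek literals surviving the trip usually requires the JDBC URL to declare UTF-8 explicitly.

```xml
<!-- data-config.xml sketch: the CONCAT expression lives in the entity's
     SQL query. Table/column/entity names here are hypothetical. The
     useUnicode/characterEncoding JDBC properties are the usual way to keep
     non-Latin literals intact between MySQL and Solr. -->
<dataSource driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://localhost/mydb?useUnicode=true&amp;characterEncoding=UTF-8"/>
<entity name="apofasi"
        query="SELECT id,
                      CONCAT('(', CAST('ΤΜΗΜΑ' AS CHAR CHARACTER SET utf8),
                             ' ', apofasi_tmima, ')') AS tmima_label
               FROM apofaseis">
  <field column="tmima_label" name="tmima_label"/>
</entity>
```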
post.jar is just there for convenience. Look at the relevant wiki
pages for actual URL examples:
https://wiki.apache.org/solr/UpdateXmlMessages
Regards,
Alex
Personal website: http://www.outerthoughts.com/
Current project: http://www.solr-start.com/ - Accelerating your Solr proficiency
On Fri
Hi Michael;
I've written an API that users send their requests to. I forward their
queries to Solr, manage which collection is theirs, and drop query
parameters related to commits. However, users can send the commitWithin
option within their request data, and I have to analyze the data inside the
request to disallow it.
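The query-string side of such a gateway can be sketched as below. The parameter names blocked here are the standard Solr update parameters; everything else about the gateway is hypothetical. Note that this only covers URL parameters — commitWithin sent inside the request body, the harder case described above, would still need the body to be parsed.

```java
// Gateway sketch: strip commit-related parameters from an update request's
// query string before forwarding it to Solr. Only URL parameters are
// handled; commitWithin embedded in the request body is NOT caught here.
import java.util.Arrays;
import java.util.Set;
import java.util.stream.Collectors;

public class CommitParamFilter {
    // Standard Solr update parameters a multi-tenant gateway might block.
    static final Set<String> BLOCKED =
        Set.of("commit", "commitWithin", "softCommit", "optimize");

    static String strip(String queryString) {
        return Arrays.stream(queryString.split("&"))
                     .filter(p -> !BLOCKED.contains(p.split("=", 2)[0]))
                     .collect(Collectors.joining("&"));
    }
}
```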
Hello,
I need to index a repository of documents (.doc) without using post.jar;
I'm using Solr with Tomcat 6.
Maybe it's possible with an HTTP REST API, but how do I use it?
Thanks for your answer,
Best regards,
Anass BENJELLOUN