how to split /mlt across multiple replicas
I know that MoreLikeThisHandler does not work on a multi-shard SolrCloud setup, because it only returns results from the shard where the source document is found. In my case I just want to split the load over two replicas of the same shard, but when I run a query it only runs on the replica I send the request to. I tried adding the shards parameter to the query, but it behaves the same way (still only the one replica).
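One way to spread the /mlt load over the two replicas is to do the balancing on the client side and query each replica's core directly (SolrJ also ships LBHttpSolrClient, which does the same round-robin for you). Below is a minimal sketch, assuming a SolrJ 6.x-style API; the host names, core names, collection layout and mlt.fl fields are placeholders, not taken from this thread.

import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

public class MltRoundRobin {
  // Placeholder URLs of the two replica cores of the single shard.
  private final List<SolrClient> replicas = List.of(
      new HttpSolrClient.Builder("http://host1:8983/solr/mycoll_shard1_replica1").build(),
      new HttpSolrClient.Builder("http://host2:8983/solr/mycoll_shard1_replica2").build());
  private final AtomicInteger next = new AtomicInteger();

  public QueryResponse moreLikeThis(String docId) throws Exception {
    // Alternate between the replicas so MLT requests are split between them.
    SolrClient client = replicas.get(Math.floorMod(next.getAndIncrement(), replicas.size()));
    SolrQuery q = new SolrQuery("id:" + docId);
    q.setRequestHandler("/mlt");      // hit the MoreLikeThis handler on that core directly
    q.set("mlt.fl", "title,body");    // placeholder similarity fields
    return client.query(q);
  }
}

Querying a core URL directly keeps the request non-distributed, which is fine here since both replicas hold the full shard anyway.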
Re: mlt and document boost
So there is no way to apply a boost to MLT, or any other way to change the order of documents in the MLT result? Or maybe there is a way to make two MLT queries at once and merge the results.
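The thread never records a fix, but one hedged workaround is to move from the /mlt handler to the {!mlt} query parser (available since roughly Solr 4.9/5.x): the "more like this" request then becomes an ordinary query, so normal boosting such as the {!boost} wrapper applies. A sketch; the field names title_t/body_t/promoted_b, the boost function and the document id are illustrative only.

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.response.QueryResponse;

public class BoostedMlt {
  public QueryResponse boostedMoreLikeThis(SolrClient client, String docId) throws Exception {
    SolrQuery q = new SolrQuery();
    // Wrap the MLT query in the boost parser: scores are multiplied by the function,
    // so documents with promoted_b set float to the top of the MLT result.
    q.setQuery("{!boost b=if(exists(promoted_b),10,1) v=$mltq}");
    // The dereferenced parameter holds the actual "more like this" query for the source doc.
    q.set("mltq", "{!mlt qf=title_t,body_t}" + docId);
    q.setRows(10);
    return client.query(q);
  }
}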
mlt and document boost
I use a MoreLikeThis query and need to boost some documents at query time. I have tried using fq and a ^ boost, but it does not work.
Re: adding a document with nested documents requires setting the id
How do I do that?
Re: adding a document with nested documents requires setting the id
"How exactly are you doing that?" Doing what? This is from the schema: id ... and this is from the config: ... I want to store multiple values in the nested document that should be grouped together, like page ids and page urls.
Re: adding a document with nested documents requires setting the id
If I add a document without nesting, the id is generated automatically (I use UUID), and this worked perfectly until I tried to add nesting. I want the same behaviour for nested documents as for non-nested ones.
adding a document with nested documents requires setting the id
I am trying to add a document with nested objects and I don't want to set the id manually; it should be generated automatically. When I add a document without nesting it is fine, but if I add _childDocuments_ I get an error: [doc=null] missing required field: id
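There is no confirmed fix recorded in the thread, but the usual explanation is that the UUID update processor only fills in the top-level document, so each child document still needs an id of its own; the simplest workaround is to assign the ids client-side for both parent and children. A sketch in SolrJ; the collection name and the field names (title_t, page_id_s, page_url_s) are placeholders.

import java.util.UUID;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.common.SolrInputDocument;

public class AddNestedDoc {
  public void addParentWithChildren(SolrClient client) throws Exception {
    SolrInputDocument parent = new SolrInputDocument();
    parent.addField("id", UUID.randomUUID().toString());    // generate the parent id client-side
    parent.addField("title_t", "parent document");          // placeholder field

    SolrInputDocument child = new SolrInputDocument();
    child.addField("id", UUID.randomUUID().toString());     // child documents need their own id too
    child.addField("page_id_s", "p-1");                     // the grouped per-page values
    child.addField("page_url_s", "http://example.com/p-1");
    parent.addChildDocument(child);

    client.add("mycoll", parent);
    client.commit("mycoll");
  }
}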
Re: solr-8983-console.log is huge
I use SolrCloud.
Re: solr-8983-console.log is huge
Thanks for the answers. Is there any way to reload log4j.properties without restarting Solr?
solr-8983-console.log is huge
That log file is constantly growing and is now ~60GB. What can I change to fix this?
Re: Losing Solr config on ZooKeeper when it is restarted
I have one instance of Solr. The thing is, when I create the collection the running Solr is used, but when I upload the config I use zkcli.
Re: Losing Solr config on ZooKeeper when it is restarted
ZK is standalone. But I think the Solr node is ephemeral.
Losing Solr config on ZooKeeper when it is restarted
Sometimes when ZooKeeper (standalone mode) is restarted it loses the Solr collections. Furthermore, when I manually upload the config again, no state.json is created for the collection; clusterstate.json is created instead. I use Solr 5.1.0.
is there a way to remove deleted documents from index without optimize
My index is updated frequently and I need to remove deleted documents from the index after an update/reindex. Optimization is very expensive, so what should I do?
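The thread has no recorded answer. Two standard options are to simply let the regular segment merges reclaim deleted documents over time, or to issue a commit with expungeDeletes=true, which merges only the segments that contain deletes and is usually much cheaper than a full optimize. A hedged SolrJ sketch, with the parameter passed as a plain request parameter and the collection name as a placeholder:

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.request.AbstractUpdateRequest;
import org.apache.solr.client.solrj.request.UpdateRequest;

public class ExpungeDeletes {
  public void expunge(SolrClient client) throws Exception {
    UpdateRequest req = new UpdateRequest();
    // Issue a hard commit...
    req.setAction(AbstractUpdateRequest.ACTION.COMMIT, true, true);
    // ...and additionally ask Solr to merge away segments that contain deleted docs.
    req.setParam("expungeDeletes", "true");
    req.process(client, "mycoll");
  }
}

Note that expungeDeletes still rewrites the affected segments, so it is not free; running it occasionally rather than after every reindex keeps the cost down.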
response time grows during update/commit
We are using Solr as a NoSQL database and periodically update a large number of documents, then commit the changes (using commitWithin). Query response time at least doubles while this is happening. What can we do about this?
Re: how to prevent the uuid field changing on an /update query?
"It sounds like you need to control when the uuid is and is not created; it just feels like you'd get better mileage doing this outside of Solr." Can I simply insert a condition (blank or not) in the uuid update chain?
Re: how to prevent the uuid field changing on an /update query?
"Why not generate the uuid client side on the initial save and reuse this on updates?" I can't do this because I have delta-import queries which should also be able to assign the uuid when it is needed.
how to prevent the uuid field changing on an /update query?
I have a uuid field. It is not set as the uniqueKey, but nevertheless I don't want it to change every time I call /update. It probably changes because I added a request handler named "/update" that contains the uuid update chain; but if I don't do this I get no uuid at all. Maybe I can configure the uuid update chain to set the uuid only if it is blank?
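For what it is worth, UUIDUpdateProcessorFactory already generates a value only when the incoming document does not carry one; the value keeps changing because every /update re-sends the whole document without the old uuid, so the old document is replaced wholesale. One hedged workaround, assuming the updateLog is enabled and the other fields are stored, is to send atomic updates for the fields you actually change, so the untouched stored fields (including the generated uuid) are carried over. A sketch; the uniqueKey name, field name and collection are placeholders.

import java.util.Collections;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.common.SolrInputDocument;

public class AtomicUpdateKeepsUuid {
  public void updateTitleOnly(SolrClient client, String docId) throws Exception {
    SolrInputDocument doc = new SolrInputDocument();
    doc.addField("id", docId);
    // The "set" map makes this an atomic update: only title_t is replaced,
    // every other stored field, including the previously generated uuid, is preserved.
    doc.addField("title_t", Collections.singletonMap("set", "new title"));
    client.add("mycoll", doc);
    client.commit("mycoll");
  }
}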
Re: DIH delta-import pk
I have an autogenerated uuid for each document in Solr. It is not marked as the uniqueKey field. I added uuid generation to the config so a uuid is created when I add a document from the client. But now the uuid changes every time I update a document.
Re: DIH delta-import pk
I don't use SQL now; I am adding the documents manually. The field is db_id_s.
Re: DIH delta-import pk
Now I have set the db id as the uniqueKey field, and the uuid field, which should be generated automatically, as required. But when I add a document I get an error that my required uuid field is missing.
Re: DIH delta-import pk
As far as I understand, I can't use two uniqueKey fields. I need both the db id and the uuid because I am moving the data from the database to the Solr index entirely. Temporarily it needs to stay compatible with delta-import, but in the future I will use only the uuid.
Re: solr add document
Thanks, I just needed to call solr.commit.
solr add document
I add a document to Solr (SolrCloud) using RSolr. It returns "responseHeader" => {"status" => 0, "QTime" => 5}. But when I search for the added document, nothing is found.
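As the earlier reply in this thread says, the document is accepted (status 0) but not yet visible to searches because no commit was issued. The requirement is the same with any client; a minimal SolrJ-flavoured sketch of the add-then-commit flow, with placeholder collection and field names, and commitWithin shown as the softer alternative:

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.common.SolrInputDocument;

public class AddAndCommit {
  public void add(SolrClient client) throws Exception {
    SolrInputDocument doc = new SolrInputDocument();
    doc.addField("id", "42");                  // placeholder id
    doc.addField("title_t", "hello solr");     // placeholder field

    // Without a commit the update is acknowledged but not searchable yet.
    client.add("mycoll", doc);
    client.commit("mycoll");

    // Alternative: let Solr commit on its own within 10 seconds of the add.
    // client.add("mycoll", doc, 10_000);
  }
}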
Re: DIH delta-import pk
OK, can I use two unique fields, one with the uuid and one with the db id? What would happen then?
DIH delta-import pk
I have a DIH delta-import query based on last_index_time and it works perfectly. But sometimes I add documents to Solr manually and I want DIH not to add them again. I have a UUID uniqueKey field and I also have the "id" from the database, which is marked as pk in the DIH config. My question is: will DIH update the existing document or add a new one? P.S. The id field is not marked as unique in the config.