Subject: Nested Documents without using "type" field ? Possible or Not ?
Hello,
I would like to use Solr to index the Cooperative Patent Classification (CPC).
The CPC has a hierarchical structure and can have more than 20 levels.
It's a plain hierarchy, without the "type" field used for nested documents.
e.g.:
A -> A01 -> A01B -> A01B3/00 -> A01B3/40 -> A01B3/4025 .
A -> A01 -> A01L -> A01L1
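One common way to model such a deep hierarchy without nested documents is to index each code's full ancestor path into a field analyzed with PathHierarchyTokenizerFactory. A minimal schema.xml sketch (the field name cpc_path and the "/" delimiter are illustrative assumptions; note that CPC symbols themselves contain "/", so a different delimiter may be needed in practice):

```xml
<!-- Hypothetical field type: emits one token per ancestor of the indexed path -->
<fieldType name="descendent_path" class="solr.TextField">
  <analyzer type="index">
    <tokenizer class="solr.PathHierarchyTokenizerFactory" delimiter="/"/>
  </analyzer>
  <analyzer type="query">
    <tokenizer class="solr.KeywordTokenizerFactory"/>
  </analyzer>
</fieldType>
<field name="cpc_path" type="descendent_path" indexed="true" stored="true"/>
```

With this, indexing the value A/A01/A01B produces the tokens A, A/A01 and A/A01/A01B, so a query like cpc_path:A/A01 matches that node and all of its descendants, at any depth.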
Thanks for the link,
So I've launched the post; I will see on Monday whether it works OK :)
On 05/06/2015 17:21, Alessandro Benedetti wrote:
I cannot see any problem with that, but talking about commits I would like
to distinguish between "hard" and "soft" commits.
Hard commit -> durability
Soft commit -> visibility
OK, thanks for this information!
On 05/06/2015 17:37, Erick Erickson wrote:
Picking up on Alessandro's point. While you can post all these docs
and commit at the end, unless you do a hard commit (
openSearcher=true or false doesn't matter), then if your server should
abnormally terminate for _any_ reason, all these docs will be
replayed on startup from the transaction log.
I cannot see any problem with that, but talking about commits I would like
to distinguish between "hard" and "soft" commits.
Hard commit -> durability
Soft commit -> visibility
I suggest this interesting read:
https://lucidworks.com/blog/understanding-transaction-logs-softcommit-and-commit-
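To make the durability/visibility split concrete, here is a typical solrconfig.xml sketch (the interval values are illustrative, not a recommendation for this workload):

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- Hard commit: flushes segments to stable storage and truncates the
       transaction log (durability); openSearcher=false keeps it cheap -->
  <autoCommit>
    <maxTime>60000</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>
  <!-- Soft commit: opens a new searcher so recent documents become
       visible to queries, without an fsync (visibility) -->
  <autoSoftCommit>
    <maxTime>5000</maxTime>
  </autoSoftCommit>
</updateHandler>
```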
Hi Alessandro,
I'm currently on my dev computer, so I would like to post 1 000 000 XML
files (with a structure defined in my schema.xml).
I have already imported 1 000 000 XML files by using
bin/post -c mydb /DATA0/1 /DATA0/2 /DATA0/3 /DATA0/4 /DATA0/5
where /DATA0/X contains 20 000 XML files.
Hi Bruno,
I cannot see what your challenge is.
Of course you can index your data in whatever flavour you want and commit
whenever you want…
Are those XML files Solr XML?
If not, you would need to use the DIH, the extract update handler, or a
custom indexer application.
Maybe I missed your point…
Give me more details.
Dear Solr Users,
I would like to post 1 000 000 records (1 record = 1 file) in one shot,
and do the commit at the end.
Is it possible to do that?
I have several directories, each with 20 000 files inside.
I would like to do:
bin/post -c mydb /DATA
where under /DATA I have
/DATA/1/*.xml (20 000 files)
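For the bulk load itself, one possible invocation is to suppress the per-run commit and issue a single hard commit at the end. This is a sketch: the -commit flag and the localhost URL assume a recent bin/post and default Solr port, so verify against bin/post -h on your version:

```shell
# Index each directory without committing after every run
# (assumption: your bin/post supports "-commit no"; check bin/post -h)
bin/post -c mydb -commit no /DATA/1 /DATA/2 /DATA/3

# One explicit hard commit once everything has been posted
# (assumes Solr is listening on the default port 8983)
curl 'http://localhost:8983/solr/mydb/update?commit=true'
```

Note Erick's caveat above still applies: until that final hard commit succeeds, an abnormal shutdown means the transaction log is replayed on startup.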
: Subject: Re: hi. allowLeadingWildcard is it possible or not yet?
:
: i wonder the same thing... so wanna "re-animate" the topic
:
: is it possible?
Leading wildcard style queries can work, and can work very
efficiently, thanks to SOLR-1321.
The key is to use ReversedWildcardFilterFactory.
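A sketch of a schema.xml field type using ReversedWildcardFilterFactory, closely modeled on the example schema shipped with Solr (the field type name is illustrative; tune the thresholds for your data):

```xml
<fieldType name="text_general_rev" class="solr.TextField" positionIncrementGap="100">
  <analyzer type="index">
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
    <!-- Indexes each token both forward and reversed so that leading
         wildcard queries can be rewritten into fast prefix queries -->
    <filter class="solr.ReversedWildcardFilterFactory" withOriginal="true"
            maxPosAsterisk="3" maxPosQuestion="2" maxFractionAsterisk="0.33"/>
  </analyzer>
  <analyzer type="query">
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
  </analyzer>
</fieldType>
```

The filter is applied only at index time; the query parser detects its presence on the field and handles queries like *ample automatically.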
i wonder the same thing... so wanna "re-animate" the topic
is it possible?
-
Smart, but doesn't work… If he worked, he would succeed…
Hi folks,
I am reading this issue and, from what I see, it's not yet possible to
search with a leading wildcard.
http://issues.apache.org/jira/browse/SOLR-218
Are there any workarounds, or any way at all, to allow such a search? I
looked through the whole 2008 and 2009 mail archive but couldn't find anything.