Trying to manipulate search results (like further filtering out unwanted ones) and
ordering the results differently. Where is the suitable place for doing it?
I've been using QueryResponseWriter, but that doesn't seem to be the right place.
thanks.
Up to your solr client.
On Mon, Nov 24, 2008 at 1:24 PM, souravm [EMAIL PROTECTED] wrote:
Hi,
Looking for some insight on distributed search.
Say I have an index distributed across 3 boxes, and the index contains time and
text data (a typical log file). Each box has the index for a different timeline -
check procedure:
1: rm -r $tomcat/webapps/*
2: rm -r $solr/data (your index data directory)
3: check the xml (any xml you modified)
4: start tomcat
I had the same error, but I forgot how I fixed it, so you can use my check procedure;
I think it will help you.
I use tomcat+solr on win2003, freebsd, and mac osx.
First, make sure the xml is utf-8, and the field values are utf-8;
second, you should post the xml as utf-8.
My advice: use utf-8 for all encoding.
It makes my solr work well (I use Chinese).
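The advice above can be checked in any client before posting: make sure the bytes you send really are UTF-8. A minimal sketch (in Python rather than the PHP used elsewhere in this thread, since the point is language-independent; the sample string is my own):

```python
# Verify that a document string is valid UTF-8 before posting it to Solr.
def ensure_utf8(text: str) -> bytes:
    """Encode text as UTF-8; raises UnicodeEncodeError on bad input."""
    data = text.encode("utf-8")
    # Round-trip check: decoding must give back the original string.
    assert data.decode("utf-8") == text
    return data

# CJK characters encode to three bytes each in UTF-8.
payload = ensure_utf8("搜索")  # "search" in Chinese
print(len(payload))  # 6 bytes for two characters
```

Posting these bytes with a charset=utf-8 content type covers both of the points above (file encoding and POST encoding).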
--
regards
j.L
I find the url is not the same as the others.
--
regards
j.L
First, you should escape some characters (code in PHP):
function escapeChars($string) {
    $string = str_replace('&', '&amp;', $string);
    $string = str_replace('<', '&lt;', $string);
    $string = str_replace('>', '&gt;', $string);
    $string = str_replace("'", '&apos;', $string);
    $string = str_replace('"', '&quot;', $string);
    return $string;
}
thanks james... How much time does it take to index 18m docs? - Eswar
On Nov 27, 2007 7:43 AM, James liu [EMAIL PROTECTED] wrote:
I do not use the HYLANDA analyzer. I use je-analyzer and index at least 18m
docs. I'm sorry, I only use a Chinese analyzer. On Nov 27, 2007 10:01
I don't think NGram is a good method for Chinese.
CJKAnalyzer of Lucene is 2-Gram.
Eswar K:
If it is a Chinese analyzer, I recommend hylanda (www.hylanda.com); it is
the best Chinese analyzer, but it is not free.
If you want a free Chinese analyzer, maybe you can try je-analyzer. It has
some problems.
If your analyzer is standard, you can try using tokenize. (You can find the answer
in the analyzer source code and schema.xml.)
On Nov 27, 2007 9:39 AM, zx zhang [EMAIL PROTECTED] wrote:
lance,
The following is an example schema fieldtype using solr1.2 and the CJK
package.
And it works. As you said, CJK
millions of documents.
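The poster's schema excerpt did not survive in the archive, so here is a hedged sketch of the common shape of such a fieldtype (the fieldtype and field names below are my own illustration, not the elided original):

```xml
<!-- Sketch: a 2-gram CJK field type using Lucene's CJKAnalyzer -->
<fieldtype name="text_cjk" class="solr.TextField">
  <analyzer class="org.apache.lucene.analysis.cjk.CJKAnalyzer"/>
</fieldtype>
<field name="content" type="text_cjk" indexed="true" stored="true"/>
```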
James,
We would have a look at hylanda too. What abt japanese and korean
analyzers,
any recommendations?
- Eswar
On Nov 27, 2007 7:21 AM, James liu [EMAIL PROTECTED] wrote:
I don't think NGram is good method for Chinese.
CJKAnalyzer of Lucene is 2-Gram.
Eswar
If I understand correctly, you just do it like this (I use php):
$data1 = getDataFromInstance1($url);
$data2 = getDataFromInstance2($url);
You just have multiple solr instances, and you getData from each instance.
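The same two-fetch pattern can be sketched outside PHP as well. A hedged Python equivalent, where the instance URLs and the merge step are my own assumptions (the original only shows the two fetches):

```python
import json
from urllib.request import urlopen

def get_data(base_url: str, query: str) -> dict:
    """Fetch one Solr instance's JSON response (wt=json)."""
    url = f"{base_url}/select?q={query}&wt=json"
    with urlopen(url) as resp:
        return json.load(resp)

def merge_docs(*responses: dict) -> list:
    """Concatenate the docs from several responses, best score first."""
    docs = [d for r in responses for d in r["response"]["docs"]]
    return sorted(docs, key=lambda d: d.get("score", 0.0), reverse=True)

# Example with two instances (hypothetical hosts):
# data1 = get_data("http://localhost:8080/solr1", "test")
# data2 = get_data("http://localhost:8080/solr2", "test")
# results = merge_docs(data1, data2)
```

The client, not Solr 1.x, is responsible for combining the two result sets, which is why the merge step matters.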
On Nov 12, 2007 11:15 PM, Dilip.TS [EMAIL PROTECTED] wrote:
Hello,
Does SOLR support
hmm
I found the error; it is my mistake, not about php and phps.
I used an old config to test, so the config had a problem:
for Title I used double as its type; it should use text.
On Nov 8, 2007 10:29 AM, James liu [EMAIL PROTECTED] wrote:
php now is ok,
but phps failed.
my code:
9:30 PM, Dave Lewis [EMAIL PROTECTED] wrote:
On Nov 7, 2007, at 2:04 AM, James liu wrote:
I just reduced the response information, and you will see my result (full,
not
part).
*before unserialize*
string(433)
a:2:{s:14:"responseHeader";a:3:{s:6:"status";i:0;s:5:"QTime";i:
0;s:6:"params";a:7
, Dave Lewis [EMAIL PROTECTED] wrote:
What are the results of the two var_dumps?
dave
On Nov 5, 2007, at 10:06 PM, James liu wrote:
first: I am sure I enabled php and phps in my solrconfig.xml.
second: I can't get an answer.
*phps:*
<?php
$url = '
http://localhost:8080/solr1/select/?
q
This afternoon I will update from svn and try the newest...
On Nov 7, 2007 11:23 AM, Dave Lewis [EMAIL PROTECTED] wrote:
On Nov 6, 2007, at 8:10 PM, James liu wrote:
first var_dump result (part, not all):
string(50506)
a:2:{s:14:"responseHeader";a:3:{s:6:"status";i:0;s:5:"QTime";i:
2906;s:6
same answer.
On Nov 7, 2007 11:41 AM, James liu [EMAIL PROTECTED] wrote:
This afternoon I will update from svn and try the newest...
On Nov 7, 2007 11:23 AM, Dave Lewis [EMAIL PROTECTED] wrote:
On Nov 6, 2007, at 8:10 PM, James liu wrote:
first var_dump result(part not all
I just reduced the response information, and you will see my result (full, not
part).
*before unserialize*
string(433)
I find they all return strings.
<?php
$url = '
http://localhost:8080/solr/select/?q=solr&version=2.2&start=0&rows=10&indent=on&wt=php
';
var_dump(file_get_contents($url));
?>
--
regards
jl
echo 'after unserialize...<br/>';
var_dump($a);
?>
On 11/6/07, Stu Hood [EMAIL PROTECTED] wrote:
Did you enable the PHP serialized response writer in your solrconfig.xml?
It is not enabled by default.
Thanks,
Stu
-Original Message-
From: James liu [EMAIL PROTECTED]
Sent: Monday
If you rebuild solr, the safe method is rm -r *tomcat*/webapps/*.
2007/11/1, Chris Hostetter [EMAIL PROTECTED]:
: Is there an easy way to find out which version of solr is running? I installed
: solr 1.2 and set up an instance using Tomcat. It was successful before.
FYI: starting a while back, the
Where can I read about 1.3's new features?
2007/10/26, Venkatraman S [EMAIL PROTECTED]:
On 10/26/07, Mike Klaas [EMAIL PROTECTED] wrote:
If we did a 1.2.x, it should (imo) contain no new features, only
important bugfixes.
I have been having a look at the trunk for quite sometime now, and must
I have 40 instances; one instance lost its segments* file (it happened after commit
and optimize).
Does anyone have a similar problem?
Can I fix this problem?
Can I recover this instance's data?
--
regards
jl
I find it happens when it does a commit.
I use the solr 1.2 release.
I use crontab to do the index work.
2007/10/15, James liu [EMAIL PROTECTED]:
I have 40 instances; one instance lost its segments* file (it happened after commit
and optimize).
Does anyone have a similar problem?
Can I fix this problem?
Can I
<field name="text" type="text" indexed="true" stored="true"
multiValued="true"/>
I think text does not need stored='true' unless you will show it. (It will help you
decrease index size and will not affect search.)
Do index and search use the same box? If so, you should monitor search
response time when
i can't download it from http://jetty.mortbay.org/jetty5/plus/index.html
--
regards
jl
, at 10:38 AM, James liu wrote:
i wanna do it.
Maybe someone did it, if so, give me some tips.
thks
--
regards
jl
--
Grant Ingersoll
http://lucene.grantingersoll.com
Lucene Helpful Hints:
http://wiki.apache.org/lucene-java/BasicsOfPerformance
http
If you use multiple solr instances with one index, each will cache individually.
So I think: can they share their cache? (They have the same config.)
--
regards
jl
are
trying to accomplish.
Thanks,
Grant
On Sep 23, 2007, at 10:38 AM, James liu wrote:
i wanna do it.
Maybe someone did it, if so, give me some tips.
thks
--
regards
jl
--
Grant Ingersoll
http://lucene.grantingersoll.com
Lucene Helpful Hints
i wanna do it.
Maybe someone did it, if so, give me some tips.
thks
--
regards
jl
i wanna try patch:
https://issues.apache.org/jira/browse/SOLR-139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel
and I downloaded the solr 1.2 release
and applied SOLR-269*.patch (while in
'/tmp/apache-solr-1.2.0/src/test/org/apache/solr/update'
)
it shows me
|Index:
OK...I see...thk u ,mike.
2007/8/31, Mike Klaas [EMAIL PROTECTED]:
On 29-Aug-07, at 10:21 PM, James liu wrote:
Does the doc size affect it?
For example, 2 billion docs at 10k per doc vs. 2 billion docs where the doc size
is 10m.
There might be other places that have a 2G limit (see the lucene index
Does the doc size affect it?
For example, 2 billion docs at 10k per doc vs. 2 billion docs where the doc size is 10m.
2007/8/30, Mike Klaas [EMAIL PROTECTED]:
2 billion docs (signed int).
On 29-Aug-07, at 6:24 PM, James liu wrote:
What are the limits for Lucene and Solr?
100m, 1000m, 5000m docs?
If I want to calculate it by my own method, what should I pay attention to?
anyone did it?
--
regards
jl
I set the field topic to indexed='false' and stored='true'.
I don't know why it is still analyzed.
Now I want it only stored, not analyzed; how can I do that?
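A stored-but-unanalyzed field is normally done with a non-analyzed type such as solr.StrField. A sketch (the topic field name is from the post; the rest is my assumption):

```xml
<!-- StrField values are stored verbatim and never analyzed -->
<fieldtype name="string" class="solr.StrField"/>
<field name="topic" type="string" indexed="false" stored="true"/>
```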
--
regards
jl
A correction: I index 17M docs, not 1.7M, so OutOfMemory happens when it
has finished indexing ~11.3m docs.
It is new index.
i think it maybe the reason:
On 7/18/07, Otis Gospodnetic [EMAIL PROTECTED] wrote:
Why? Too small of a Java heap. :)
Increase the size of the Java heap and lower the
When I index 1.7m docs at 4k-5k per doc,
OutOfMemory happens when it has finished indexing ~1.13m docs.
I just restart tomcat, delete all locks, and restart the indexing.
No error or warning info until it finishes.
Does anyone know why, or has anyone seen the same error?
--
regards
jl
You can find the datadir configuration in solrconfig.xml (solr 1.2).
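For reference, that setting looks like this in solrconfig.xml (the path below is an example of mine, not from the post):

```xml
<!-- solrconfig.xml: where the index data lives -->
<dataDir>/var/solr/data</dataDir>
```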
2007/7/10, nithyavembu [EMAIL PROTECTED]:
Hi,
I tried as you said and got the result without any error. So we can put
the solr home anywhere. But we have to give the path correctly in solr.xml.
Am I correct?
Now I am one step
I use freebsd.
2007/6/16, Yonik Seeley [EMAIL PROTECTED]:
On 6/14/07, James liu [EMAIL PROTECTED] wrote:
I just timed my script to get data from 2 solr boxes, not the complete
script.
It just queries the two boxes and returns id,score with rows=10; the response type is
json.
And I see their qtime are all zero.
solr) result so I can test?
Thanks
-Nick
On 6/28/07, James liu [EMAIL PROTECTED] wrote:
The code did not change, and I do not use utf8_decode. Should I?
2007/6/28, Nick Jenkin [EMAIL PROTECTED]:
Hi James
It is totally not optimized, when you say change your content into
???, I assume
It is slower than json and xml, and it changes my content into ???;
when I use json, the content is ok.
This afternoon, I will read your code.
2007/6/27, James liu [EMAIL PROTECTED]:
ok, thks nick, I just forgot to replace the jar file.
Wait a minute; I will test the speed...
2007/6/27, Nick Jenkin
The code did not change, and I do not use utf8_decode. Should I?
2007/6/28, Nick Jenkin [EMAIL PROTECTED]:
Hi James
It is totally not optimized, when you say change your content into
???, I assume this is because of UTF8 issues, are you using
utf8_decode etc?
Thanks
-Nick
On 6/28/07, James liu
I mean define it in schema.xml.
--
regards
jl
[EMAIL PROTECTED]:
I have some good news :o)
https://issues.apache.org/jira/browse/SOLR-275
Please let me know if you find any bugs
Thanks
-Nick
On 6/26/07, James liu [EMAIL PROTECTED] wrote:
I think it is simple for you,
so I will wait for your good news.
2007/6/26, Nick Jenkin [EMAIL PROTECTED
Have you tried it first? Which system do you use?
If you use freebsd, just give up trying; it does not fit freebsd.
2007/6/27, Otis Gospodnetic [EMAIL PROTECTED]:
Hi,
Here is a puzzling one. I can't get Solr to invoke snaphooter
properly. Solr claims my snapshooter is not where I said it is:
SEVERE:
2007/6/27, Mike Klaas [EMAIL PROTECTED]:
On 25-Jun-07, at 10:53 PM, James liu wrote:
[quote]how can I keep the index all in ram, and how do I config which ram
I should
use?[/quote]
Your os will automatically load the most frequently-used parts of the
index in ram.
If your total ram
Very strange; do only I fail? Does anyone have the same question?
If you are free, maybe you can zip your solr and mail it to me, and I will try it again.
2007/6/26, Nick Jenkin [EMAIL PROTECTED]:
Interesting, what version of solr are you using, I tested on 1.2.
-Nick
On 6/26/07, James liu [EMAIL PROTECTED] wrote:
i just cp
how about its performance?
2007/6/26, Kijiji Xu, Ping [EMAIL PROTECTED]:
I had solved this problem; below is my POST code. I used HTTP_Request from
PEAR; it's so simple. Thank you all very much. FYI:
private function doPost($url,$postData){
$req = new HTTP_Request($url,array(
I use tomcat. Send your solr version to me and I will try it again.
2007/6/27, Nick Jenkin [EMAIL PROTECTED]:
If you are using the example provided in 1.2 (using jetty) you need to
use ant example
rather than ant dist
-Nick
On 6/27/07, James liu [EMAIL PROTECTED] wrote:
Yes, i use 1.2my
ok, thks nick, I just forgot to replace the jar file.
Wait a minute; I will test the speed...
2007/6/27, Nick Jenkin [EMAIL PROTECTED]:
http://nickjenkin.com/misc/apache-solr-1.2.0-php-serialize.tar.gz
Try that
-Nick
On 6/27/07, James liu [EMAIL PROTECTED] wrote:
i use tomcat ,, send ur solr
in SolrIndexSearcher, and I did not change it.
2007/6/25, James liu [EMAIL PROTECTED]:
I mean: how do I add it to my solr (1.2 production)?
2007/6/25, James liu [EMAIL PROTECTED]:
aha, it seems good; how can I fit it into my solr? I don't know how to work
with it.
2007/6/25, Nick Jenkin [EMAIL PROTECTED
thks Yonik, and
[quote]how can I keep the index all in ram, and how do I config which ram I should
use?[/quote]
Which files should I change in the source?
And if I change them ok,
how do I compile? Just ant dist?
--
regards
jl
be interested in:
http://issues.apache.org/jira/browse/SOLR-196
-Nick
On 6/25/07, James liu [EMAIL PROTECTED] wrote:
Which files should I change in the source?
And if I change them ok,
how do I compile? Just ant dist?
--
regards
jl
--
regards
jl
I mean: how do I add it to my solr (1.2 production)?
2007/6/25, James liu [EMAIL PROTECTED]:
aha, it seems good; how can I fit it into my solr? I don't know how to work
with it.
2007/6/25, Nick Jenkin [EMAIL PROTECTED]:
Hi James
I think you would be better off outputting a PHP array, and running
aha, the same question I found a few days ago.
I'm sorry I forgot to submit it.
2007/6/22, Yonik Seeley [EMAIL PROTECTED]:
On 6/21/07, Ryan McKinley [EMAIL PROTECTED] wrote:
I just started running the scripts and
The commit script seems to run fine, but it says there was an error. I
looked into
If just one master or one slave server fails, I think you can maybe use the master
index server.
Shell controlled by a program is easy for me; I use php and shell_exec.
2007/6/21, Otis Gospodnetic [EMAIL PROTECTED]:
Right, that SAN con 2 Masters sounds good. Lucky you with your lonely
Master!
I see SOLR-215 from this mail.
Does it now really support multiple indexes, and will a search return merged
data?
For example:
I want to search: aaa, and I have index1, index2, index3, index4. It should
return the results from index1, index2, index3, index4 and merge the results by
score, datetime, or
solr: 1.2
curl http://192.168.7.6:8080/solr0/update --data-binary
'<delete><query>nodeid:20</query></delete>'
I remember it was ok when I used solr 1.1.
Did it change?
It shows me:
HTTP Status 400 - missing content stream
--
*type* Status report
*message* *missing content
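A commonly suggested cause for "missing content stream" on /update is that the request body is not sent with an explicit XML content type. A hedged sketch of building such a request (the host and query come from the post above; the header requirement is an assumption worth testing, and the request is built but not sent here):

```python
from urllib.request import Request

def build_delete_request(base_url: str, query: str) -> Request:
    """Build a delete-by-query POST with an explicit XML content type."""
    body = f"<delete><query>{query}</query></delete>".encode("utf-8")
    return Request(
        f"{base_url}/update",
        data=body,
        headers={"Content-Type": "text/xml; charset=utf-8"},
    )

req = build_delete_request("http://192.168.7.6:8080/solr0", "nodeid:20")
print(req.data)          # b'<delete><query>nodeid:20</query></delete>'
print(req.get_method())  # urllib infers POST when data is set
```

The curl equivalent of the header would be adding -H 'Content-Type: text/xml; charset=utf-8' to the command above.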
It seems strange when I refresh the same url search.
The time changes... sometimes it takes 0.01021409034729 s, sometimes
0.0080091953277588 s, sometimes 0.024219989776611 s.
It changes too much.
Only I use it, with few searches, so I think the memory is not all used.
Why does the time change so much, and i
ok, I find it only happens on win.
2007/6/19, James liu [EMAIL PROTECTED]:
It seems strange when I refresh the same url search.
The time changes... sometimes it takes 0.01021409034729 s, sometimes
0.0080091953277588 s, sometimes 0.024219989776611 s.
It changes too much.
Only I use
For example:
I want to show keyword a and facet sid:2.
my url:
http://localhost:8080/solr1/select?q=a+sid:2&start=0&rows=10&fl=*&wt=json
but it shows me a count bigger than the facet num.
I read http://lucene.apache.org/java/docs/queryparsersyntax.html
and tried several ways, all to no effect.
Maybe someone
thks.
2007/6/17, Yonik Seeley [EMAIL PROTECTED]:
On 6/16/07, James liu [EMAIL PROTECTED] wrote:
I want to show keyword a and facet sid:2.
my url:
http://localhost:8080/solr1/select?q=a+sid:2&start=0&rows=10&fl=*&wt=json
but it shows me a count bigger than the facet num.
'+' in a URL is like a space
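This is the standard URL-encoding rule: an unencoded '+' in a query string decodes as a space, so a literal '+' must be sent as %2B. A quick illustration in Python (the example string is the query from the post):

```python
from urllib.parse import quote, unquote_plus

# What the server sees for the URL above: '+' decodes to a space,
# so q=a+sid:2 is parsed as the two clauses "a" and "sid:2".
print(unquote_plus("a+sid:2"))     # 'a sid:2'

# To send a literal '+', percent-encode it as %2B.
print(quote("a+sid:2", safe=":"))  # 'a%2Bsid:2'
```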
I wrote a script to measure run time to check performance.
I find a very interesting thing: I query 2 solr boxes to get data, and the solr
responses show me qtime all zero.
But I find the multi-get data script takes 0.046674966812134 s (it will
change).
The solr boxes are on my pc, and the index data is very small.
Is that ok?
2007/6/14, vanderkerkoff [EMAIL PROTECTED]:
Hi Yonik
Here's the output from netcat
POST /solr/update HTTP/1.1
Host: localhost:8983
Accept-Encoding: identity
Content-Length: 83
Content-Type: text/xml; charset=utf-8
that looks Ok to me, but I am a bit twp you see.
:-)
Yonik
2007/6/14, Yonik Seeley [EMAIL PROTECTED]:
On 6/14/07, James liu [EMAIL PROTECTED] wrote:
I wrote a script to measure run time to check performance.
I find a very interesting thing: I query 2 solr boxes to get data, and
the solr
responses show me qtime all zero.
But I find the multi-get data script
If you use jetty, you should look at jetty's log.
If you use tomcat, you should look at tomcat's log.
solr is only a program that runs in a container.
2007/6/15, Ryan McKinley [EMAIL PROTECTED]:
what version of solr/container are you running?
this sounds similar to what people running solr 1.1 with the
Does anyone agree?
What is the plan for solr's next development? Does anyone know?
--
regards
jl
2007/6/7, Yonik Seeley [EMAIL PROTECTED]:
On 6/6/07, James liu [EMAIL PROTECTED] wrote:
anyone agree?
No ;-)
At least not if you mean using map-reduce for queries.
When I started looking at distributed search, I immediately went and
read the map-reduce paper (easier concept than it first
thks Solr Committers
--
regards
jl
solr 1.3dev 2007-06-04 (svn)
the tomcat log shows me this error:
solr 1.3dev 2007-06-04
org.apache.solr.core.SolrException: Unknown fieldtype 'string'
I find it is only used in schema.xml:
<field name="id" type="string" indexed="true" stored="true"
required="true" />
I modified it and now the start is
thks, ryan, I found 'required' in changes.txt
2007/6/4, Ryan McKinley [EMAIL PROTECTED]:
I modified it and now the start is ok:
<field name="id" type="integer" stored="true" />
What does the property 'required' mean?
I did not find it in the comments.
required means that the field *must* be specified when you add
2007/5/29, Chris Hostetter [EMAIL PROTECTED]:
: facet.analyzer is true, do analyze, if false don't analyze.
: What if Solr doesn't have access to the unindexed version? My
: suggestion would be to copyField into an unanalyzed version, and
: facet on that.
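That copyField suggestion looks roughly like this in schema.xml (the field names here are illustrative assumptions, not from the thread):

```xml
<!-- analyzed field for searching -->
<field name="title" type="text" indexed="true" stored="true"/>
<!-- unanalyzed copy for faceting: string fields are never tokenized -->
<field name="title_facet" type="string" indexed="true" stored="false"/>
<copyField source="title" dest="title_facet"/>
```

Faceting on title_facet then returns whole untokenized values, which sidesteps the Chinese word-splitting issue raised below.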
me too.
yeah, i'm not even sure
facet.analyzer true: do analyze; if false, don't analyze.
Why do I say that? Chinese words are not split by spaces, so if the field is analyzed, it will
change.
For now I will use a map to fix it, until there is a facet.analyzer.
--
regards
jl
I find it always happens after indexing has been running for a while.
For example, it will happen 1-2 hours after starting the index.
2007/5/24, James liu [EMAIL PROTECTED]:
i find one interesting thing.
when I index data with 45 solr boxes (data: 1700w, i.e. 17 million docs; freebsd6, java:
diablo-1.5.0_07-b01
2007/5/25, Chris Hostetter [EMAIL PROTECTED]:
: when I index data with 45 solr boxes (data: 1700w docs, freebsd6, java:
: diablo-1.5.0_07-b01, tomcat6), a write lock will happen during the procedure.
1) bug reports about errors are nearly useless without a real error
message including a stack trace.
multi layer:
solr's current procedure:
user query -> solr instance -> show results
I think that may be simple for some applications.
Maybe this procedure fits better:
user query -> Master solr query instance -> single solr query instance ->
show results
master solr query instance:
it can define some global
How do you make sure your file is encoded in utf-8?
2007/5/24, Ethan Gruber [EMAIL PROTECTED]:
Hi,
I am attempting to post some unicode XML documents to my solr index. They
are encoded in UTF-8. When I attempt to query from the solr admin page,
I'm
basically getting garbage text in return.
clear here
Thanks
-Amit
James liu wrote:
First, try enabling highlighting (
http://wiki.apache.org/solr/HighlightingParameters)
and try the solr admin gui to see its output; you will find what you want.
2007/5/23, solruser [EMAIL PROTECTED]:
Hi,
I am wondering can we get the list of all
aha, it is wonderful.
2007/5/24, Mike Austin [EMAIL PROTECTED]:
Just one.
-Original Message-
From: James liu [mailto:[EMAIL PROTECTED]
Sent: Wednesday, May 16, 2007 10:30 PM
To: solr-user@lucene.apache.org
Subject: Re: PriceJunkie.com using solr!
How many solr instances?
2007/5
i find one interesting thing.
when I index data with 45 solr boxes (data: 1700w, i.e. 17 million docs; freebsd6, java:
diablo-1.5.0_07-b01, tomcat6), a write lock will happen during the procedure.
Reindexing with the solr box which had the write-block problem
shows it works well.
It has happened several times, so I want to know why.
First, try enabling highlighting (
http://wiki.apache.org/solr/HighlightingParameters)
and try the solr admin gui to see its output; you will find what you want.
2007/5/23, solruser [EMAIL PROTECTED]:
Hi,
I am wondering can we get the list of all highlighted terms from the
search
query. If
The attachment is a json_encode string which has @.
Now I find it is a PHP JSON function bug, because it happens not only when
the encoded string has @.
Now I use the JSON_PHP class. (http://mike.teczno.com/json.html)
test code (PHP CODE):
<?php
require('json.php');
$json = new Services_JSON();
$text =
If you get null from json_decode($data), maybe your $data has '@'. A fix
is to replace it before you do json_decode.
I tried json_encode with php and json_decode with php; there is no problem when
I use '@'.
Maybe it only happens with encode (by java) and decode (by php).
--
regards
jl
2007/5/18, Chris Hostetter [EMAIL PROTECTED]:
:
: If you get null from json_decode($data), maybe your $data has '@'. A fix
: is to replace it before you do json_decode.
:
: I tried json_encode with php and json_decode with php; there is no problem
: when
: I use '@'.
:
: Maybe it only happens with encode (by
How many solr instances?
2007/5/17, Yonik Seeley [EMAIL PROTECTED]:
Congrats, very nice job!
It's fast too.
-Yonik
On 5/16/07, Mike Austin [EMAIL PROTECTED] wrote:
I just wanted to say thanks to everyone for the creation of solr. I've
been
using it for a while now and I have recently
If I use multiple index boxes, how do I paginate correctly with sort by score?
For example, I want to search across 60 index boxes and sort by score.
I don't know the numFound from every index box, and each has different
content.
To promise 10 pages sorted by score correctly, I think solr's start is 0,
2007/5/15, Mike Klaas [EMAIL PROTECTED]:
On 14-May-07, at 1:35 AM, James liu wrote:
if use multi index box, how to pagination with sort by score
correctly?
for example, i wanna query search with 60 index box and sort by
score.
i don't know the num found from every index box which have
If I set rows=(page-1)*10, it will lose results which fit the query.
How do I set start when paginating?
2007/5/15, James liu [EMAIL PROTECTED]:
2007/5/15, Mike Klaas [EMAIL PROTECTED]:
On 14-May-07, at 1:35 AM, James liu wrote:
if use multi index box, how to pagination with sort
Klaas [EMAIL PROTECTED]:
On 14-May-07, at 6:49 PM, James liu wrote:
2007/5/15, Mike Klaas [EMAIL PROTECTED]:
On 14-May-07, at 1:35 AM, James liu wrote:
When you get up to 60 partitions, you should make it a multi stage
process. Assuming your partitions are disjoint and evenly
distributed
For example, I query lucene and its numFound is 234300,
and the results should be sorted by score.
If you do that, how do you paginate and sort by score?
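Mike's answer below can be made concrete: if every partition returns its top start+rows hits sorted by score, a global merge of those per-partition lists yields a correct global page. A hedged sketch with simulated partition responses (the data and helper names are mine, not from the thread):

```python
import heapq

def global_page(partitions, page, rows=10):
    """Merge per-partition hit lists (already sorted by score, descending)
    and return the hits for the requested 1-based page."""
    # Each partition must contribute at least page*rows hits (or all it has),
    # otherwise a high-scoring hit deep in one partition could be missed.
    needed = page * rows
    tops = [p[:needed] for p in partitions]
    # heapq.merge requires ascending inputs, so merge on the negated score.
    merged = heapq.merge(*tops, key=lambda hit: -hit[1])
    ranked = list(merged)
    return ranked[(page - 1) * rows : page * rows]

# Simulated partitions: (doc_id, score) pairs, sorted by score descending.
p1 = [("a", 9.0), ("b", 5.0), ("c", 1.0)]
p2 = [("d", 8.0), ("e", 7.0), ("f", 2.0)]
print(global_page([p1, p2], page=1, rows=3))
# [('a', 9.0), ('d', 8.0), ('e', 7.0)]
```

This is why per-partition max scores do not matter: the merge compares actual scores across partitions, so a page is globally correct even if one partition dominates it.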
2007/5/15, Mike Klaas [EMAIL PROTECTED]:
On 14-May-07, at 7:15 PM, James liu wrote:
if i set rows=(page-1)*10,,,it will lose more result which
2007/5/15, Mike Klaas [EMAIL PROTECTED]:
On 14-May-07, at 8:55 PM, James liu wrote:
thks for your detailed answer,
but you ignored sorting by score:
p1, p2, p1, p1, p3, p4, p1, p1
maybe their max score is lower than those from p19, p20.
I'm not ignoring it: I'm implying that the above is the correct
by the looks, thus your
delete query must have deleted everything. That would be why you are
getting no results.
-Nick
On 5/10/07, James liu [EMAIL PROTECTED] wrote:
i use commands like this:
curl http://localhost:8983/solr/update --data-binary
'<delete><query>name:DDR</query></delete>'
curl http
get it. thks yonik.
2007/5/10, Yonik Seeley [EMAIL PROTECTED]:
On 5/10/07, Ajanta Phatak [EMAIL PROTECTED] wrote:
I believe in lucene at least deleting documents only marks them for
deletion. The actual delete happens only after closing the IndexReader.
Not sure about Solr
Closing an
You should know the id is a unique number.
2007/5/11, David Xiao [EMAIL PROTECTED]:
Hello all,
I have tested using post.sh in the example directory to add xml documents
to solr. It works when I add them one by one.
But when I have a lot of .xml files to post (say about 500-1000 files)
and I wrote a
i use commands like this:
curl http://localhost:8983/solr/update --data-binary
'<delete><query>name:DDR</query></delete>'
curl http://localhost:8983/solr/update --data-binary '<commit/>'
and i get
numDocs : 0
maxDoc : 1218819
When I search for something which existed before the delete, I find nothing.
aha, I just want to show the count.
thks, Hoss
2007/5/8, Chris Hostetter [EMAIL PROTECTED]:
: if I use 100 facets, it will cost more than using 10 facets.
you can't show the top 10 unless you calculate the count for all of them.
if you are using facet.field, Solr is already computing the
<str name="facet.field">forumname</str>
<str name="facet.query">forumname:娱乐</str>
</lst>
2007/5/8, James liu [EMAIL PROTECTED]:
aha, I just want to show the count.
thks, Hoss
2007/5/8, Chris Hostetter [EMAIL PROTECTED]:
: if I use 100 facets, it will cost more than using 10 facets
I use freebsd (csh), and use a cmd like
/tmp/*tomcat*/bin/startup.sh
If you use
./apache-tomcat-5.5.20/bin/startup.sh
you may need
chmod +x ./*tomcat*/bin/startup.sh
or
sh ./*tomcat*/bin/startup.sh
I have 15 instances in one box (using tomcat).
2007/5/9, Teruhiko Kurosaka [EMAIL
aha, I find it is used by cnet.com
url: http://shopper.cnet.com/4566-6501_9-0.html?tag=stbc.gp
and http://shopper.cnet.com/4566-6501_9-0.html?sa=136&tag=stbc.gp
How does it do it?
2007/5/8, James liu [EMAIL PROTECTED]:
For example, my facet queries number 100.
I want to show the top 10 facets;
if I use 100 facets, it will cost more than using 10 facets.
So I think it maybe needs two facet styles, or a facet rule.
If it is like what you said, I think I know how to do it.
thks, Mike
2007/5/8, Mike Klaas [EMAIL PROTECTED]:
On 5/7/07, James liu [EMAIL PROTECTED] wrote:
for example, my facet