I was looking for an answer to the same question, and I have a similar concern.
It looks like any serious customization work requires developing a custom
SearchComponent, but it's not clear to me how the Solr designers intended this
to be done. I would feel more confident either doing it at the Lucene level, or
staying on the client side and manipulating the search results there (such as
filtering out unwanted hits, or ordering the results differently). Where is the
suitable place for doing this? I've been using QueryResponseWriter, but that
doesn't seem to be the right place.
Thanks.
___
That's up to your Solr client.
On Mon, Nov 24, 2008 at 1:24 PM, souravm <[EMAIL PROTECTED]> wrote:
> Hi,
>
> Looking for some insight on distributed search.
>
> Say I have an index distributed in 3 boxes and the index contains time and
> text data (typical log file). Each box has index for different tim
First, make sure the XML is UTF-8 and the field values are UTF-8.
Second, you should POST the XML as UTF-8.
My advice: use UTF-8 for all encoding.
That made my Solr work well; I use Chinese.
--
regards
j.L
Check procedure:
1: rm -r $tomcat/webapps/*
2: rm -r $solr/data (your index data directory)
3: check any XML you modified
4: start Tomcat
I had the same error, but I forgot how I fixed it, so you can use my check
procedure; I think it will help you.
I use Tomcat + Solr on Win2003, FreeBSD, and Mac OS X 10.5.5,
and I find the URL is not the same as the others.
--
regards
j.L
First, you should escape some characters, like this (code in PHP):

function escapeChars($string) {
    $string = str_replace("&", "&amp;", $string);
    $string = str_replace("<", "&lt;", $string);
    $string = str_replace(">", "&gt;", $string);
    $string = str_replace("'", "&apos;", $string);
    $string = str_replace('"', "&quot;", $string);
    return $string;
}
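The same mapping is available in Python's standard library; a quick cross-check of the escaping above (the quote entities have to be passed in explicitly):

```python
from xml.sax.saxutils import escape

def escape_chars(s: str) -> str:
    # escape() handles &, <, > by itself; quotes are added via the entities map.
    return escape(s, {"'": "&apos;", '"': "&quot;"})

result = escape_chars('Tom & Jerry <"cat">')
```

In PHP itself, htmlspecialchars with ENT_QUOTES covers the same characters.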
> specific reason why the CJK analyzers in Solr
> were chosen to be n-gram based instead of
> a morphological analyzer, which is
> kind of what is implemented in Google, as it is
> considered to be more effective than the
> n-gram ones?
lly be indexing millions of documents.
>
> James,
>
> We will have a look at hylanda too. What about Japanese and Korean
> analyzers; any recommendations?
>
> - Eswar
>
> On Nov 27, 2007 7:21 AM, James liu <[EMAIL PROTECTED]> wrote:
>
> > I don'
If your analyzer is the standard one, you can try using the tokenizer. (You can
find the answer in the analyzer source code and schema.xml.)
On Nov 27, 2007 9:39 AM, zx zhang <[EMAIL PROTECTED]> wrote:
> lance,
>
> The following is an example schema fieldType using Solr 1.2 and the CJK
> package.
> And it works. As you said,
I don't think n-gram is a good method for Chinese.
Lucene's CJKAnalyzer is 2-gram.
Eswar K:
If it is a Chinese analyzer, I recommend hylanda (www.hylanda.com); it is
the best Chinese analyzer, but it is not free.
If you want a free Chinese analyzer, maybe you can try je-analyzer. It has
some problems when
If you use Tomcat, it defaults to port 8080 (and other default ports).
So just use another Tomcat which uses 8181 and other ports. (I remember you
should modify three ports per Tomcat.)
I used to have four Tomcats on one server.
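For reference, the three ports live in each Tomcat's conf/server.xml; a sketch for a second instance (the numbers are examples):

```xml
<!-- shutdown port, default 8005 -->
<Server port="8006" shutdown="SHUTDOWN">
  <Service name="Catalina">
    <!-- HTTP connector, default 8080 -->
    <Connector port="8181" maxThreads="150"/>
    <!-- AJP connector, default 8009 -->
    <Connector port="8010" protocol="AJP/1.3"/>
  </Service>
</Server>
```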
On Nov 9, 2007 7:39 AM, Isart Montane <[EMAIL PROTECTED]> wrote:
> Hi all,
>
If I understand correctly, you just do it like this (I use PHP):
$data1 = getDataFromInstance1($url);
$data2 = getDataFromInstance2($url);
You simply have multiple Solr instances and fetch the data from each.
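Client-side, the merge over those per-instance results can be sketched like this (Python here for illustration; the getDataFromInstance* helpers above are assumed to return lists of docs with a score field):

```python
def merge_results(docs_a, docs_b, rows=10):
    """Merge result pages from two independent Solr instances by score."""
    merged = sorted(docs_a + docs_b, key=lambda d: d["score"], reverse=True)
    return merged[:rows]

page = merge_results(
    [{"id": "a1", "score": 0.9}, {"id": "a2", "score": 0.2}],
    [{"id": "b1", "score": 1.5}],
    rows=2,
)
```

One caveat: scores from separate instances are not strictly comparable unless the collections' term statistics are similar, which is part of why server-side distributed search is preferable.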
On Nov 12, 2007 11:15 PM, Dilip.TS <[EMAIL PROTECTED]> wrote:
> Hello,
>
> Does SOLR supports
Thanks to everybody who gave me help,
especially Dave. Thank you.
On Nov 8, 2007 11:21 AM, James liu <[EMAIL PROTECTED]> wrote:
> hmm
>
> I found the error; it is my mistake, not a problem with PHP or phps.
>
> I used an old config to test, so the config had a problem.
>
> That is, Title i
hmm
I found the error; it is my mistake, not a problem with PHP or phps.
I used an old config to test, so the config had a problem.
That is, Title used double as its type; it should use text.
On Nov 8, 2007 10:29 AM, James liu <[EMAIL PROTECTED]> wrote:
> php now is ok..
>
>
;q";s:1:"2";s:2:"wt";s:4:"phps";s:4:"rows";a:2:{i:0;s:1:"2";i:1;s:2:"10";}s:7:"version";s:3:"
> 2.2";}}s:8:"response";a:3:{s:8:"numFound";i:28;s:5:"start";i:0;s:4:"docs";a:2:{i
I just trimmed the answer information, and you will see my result (full, not
partial).
*before unserialize*
> string(433)
> "a:2:{s:14:"responseHeader";a:3:{s:6:"status";i:0;s:5:"QTime";i:0;s:6:"params";a:7:{s:2:"fl";s:5:"Title";s:6:"indent";s:2:"on";s:5:"start";s:1:"0";s:1:"q";s:1:"2";s:2:"wt";s:4:"ph
same answer.
On Nov 7, 2007 11:41 AM, James liu <[EMAIL PROTECTED]> wrote:
> This afternoon I will update from svn and try the newest.
>
>
>
>
> On Nov 7, 2007 11:23 AM, Dave Lewis <[EMAIL PROTECTED]> wrote:
>
> >
> > On Nov 6, 2007, at 8:10 PM, James li
This afternoon I will update from svn and try the newest.
On Nov 7, 2007 11:23 AM, Dave Lewis <[EMAIL PROTECTED]> wrote:
>
> On Nov 6, 2007, at 8:10 PM, James liu wrote:
>
> > first var_dump result(part not all):
> >
> > string(50506)
> >> "a:2:{
t;2";s:2:"wt";s:4:"phps";s:4:"rows";s:2:"10";s:7:"version";s:3:"
> 2.2";}}
>
The two var_dump results:
bool(false)
On Nov 6, 2007 10:36 PM, Dave Lewis <[EMAIL PROTECTED]> wrote:
> What are the results of the two var_dumps?
var_dump($a);
$a = unserialize($a);
echo 'after unserialize...';
var_dump($a);
?>
On 11/6/07, Stu Hood <[EMAIL PROTECTED]> wrote:
>
> Did you enable the PHP serialized response writer in your solrconfig.xml?
> It is not enabled by default.
>
> Thanks,
> Stu
>
>
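For reference, enabling it in solrconfig.xml looks roughly like this (class name as of the Solr 1.2-era patches; verify against your release):

```xml
<queryResponseWriter name="phps"
    class="org.apache.solr.request.PHPSerializedResponseWriter"/>
```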
I know that, but try it; you will find a similar problem.
On 11/5/07, Robert Young <[EMAIL PROTECTED]> wrote:
>
> I would imagine you have to unserialize
>
> On 11/5/07, James liu <[EMAIL PROTECTED]> wrote:
> > i find they all return string
> >
> > >
I find they all return a string:

<?php
$url = 'http://localhost:8080/solr/select/?q=solr&version=2.2&start=0&rows=10&indent=on&wt=php';
var_dump(file_get_contents($url));
?>
--
regards
jl
If you rebuild Solr, the safe method is rm -r *tomcat*/webapps/*.
2007/11/1, Chris Hostetter <[EMAIL PROTECTED]>:
>
>
> : Is there an easy to find out which version of solr is running. I
> installed
> : solr 1.2 and set up an instance using Tomcat. It was successful before.
>
> FYI: starting a while b
Where can I read about the 1.3 new features?
2007/10/26, Venkatraman S <[EMAIL PROTECTED]>:
>
> On 10/26/07, Mike Klaas <[EMAIL PROTECTED]> wrote:
> >
> > If we did a 1.2.x, it shoud (imo) contain no new features, only
> > important bugfixes.
>
>
> I have been having a look at the trunk for quite sometime n
I find it happens when it does a commit.
I use the Solr 1.2 release.
I use crontab to do the index work.
2007/10/15, James liu <[EMAIL PROTECTED]>:
>
> I have 40 instances; one instance lost its segments* file (happened after
> commit and optimize)
>
> Has anyone had a similar problem?
>
I have 40 instances; one instance lost its segments* file (this happened after
commit and optimize).
Has anyone had a similar problem?
Can I fix this problem?
Can I recover this instance's data?
--
regards
jl
there.
>
> Otis
>
> --
>
> Lucene - Solr - Nutch - Consulting -- http://sematext.com/
>
>
>
>
> - Original Message
> From: James liu <[EMAIL PROTECTED]>
> To: solr-user@lucene.apache.org
> Sent: Tuesday, October 9, 2007 11:15:56 PM
> Subject: i
I just want to know whether anything exists that can decrease index size, other
than adding hardware or optimizing Lucene params.
--
regards
jl
I think text does not need stored='true' unless you will display it. (That will
help you decrease index size and will not affect search.)
Do index and search use the same box? If so, you should monitor search
response time while indexing (including CPU and RAM changes).
I had a similar problem and I increased the JVM s
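The stored='true' advice above, in schema.xml terms (the field names are made up):

```xml
<!-- searchable but never displayed: indexed only, smaller index -->
<field name="body"  type="text" indexed="true" stored="false"/>
<!-- shown in results, so it must be stored -->
<field name="title" type="text" indexed="true" stored="true"/>
```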
i can't download it from http://jetty.mortbay.org/jetty5/plus/index.html
--
regards
jl
If you use multiple Solr instances with one index, each will cache individually.
So I wonder whether they could share their cache (they have the same config).
--
regards
jl
to accomplish.
>
> Thanks,
> Grant
>
> On Sep 23, 2007, at 10:38 AM, James liu wrote:
>
> > I want to do it.
> >
> > Maybe someone has done it; if so, please give me some tips.
> >
> > Thanks
> >
> > --
> > regards
> > jl
>
> ---
to see them immediately, or just the current user?
> >
> > We can better help you if you give us more details on what you are
> > trying to accomplish.
> >
> > Thanks,
> > Grant
> >
> > On Sep 23, 2007, at 10:38 AM, Jam
I want to do it.
Maybe someone has done it; if so, please give me some tips.
Thanks
--
regards
jl
Thanks, Ryan.
2007/9/10, Ryan McKinley <[EMAIL PROTECTED]>:
>
> James liu wrote:
> > i wanna try patch:
> >
> https://issues.apache.org/jira/browse/SOLR-139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel
> >
> > and i download solr1.2 r
I want to try the patch:
https://issues.apache.org/jira/browse/SOLR-139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel
I downloaded the Solr 1.2 release and ran
patch < SOLR-269*.patch (while in
'/tmp/apache-solr-1.2.0/src/test/org/apache/solr/update')
and it shows me
|Index: src/test/org/apache
OK, I see. Thank you, Mike.
2007/8/31, Mike Klaas <[EMAIL PROTECTED]>:
>
>
> On 29-Aug-07, at 10:21 PM, James liu wrote:
>
> > Is it affected by doc size?
> >
> > For example: 2 billion docs at 10k per doc vs. 2 billion docs where the
> > doc size is 10m.
>
>
Is it affected by doc size?
For example: 2 billion docs at 10k per doc vs. 2 billion docs where the doc size is 10m.
2007/8/30, Mike Klaas <[EMAIL PROTECTED]>:
>
> 2 billion docs (signed int).
>
> On 29-Aug-07, at 6:24 PM, James liu wrote:
>
> > what is the limits for Lucene an
What are the limits for Lucene and Solr?
100M, 1000M, 5000M, or some other number of docs?
2007/8/24, Walter Underwood <[EMAIL PROTECTED]>:
>
> It should work fine to index them and search them. 13 million docs is
> not even close to the limits for Lucene and Solr. Have you had problems?
>
> wunder
>
> On
Lucene is a search library, Solr is a search server that uses
> Lucene.
>
> Cheers,
> Grant
>
> On Aug 8, 2007, at 2:57 AM, James liu wrote:
>
> > If I want to calculate it with my own method, is there anything I should
> > watch out for?
> >
> > Has anyone done it?
> >
> >
If I want to calculate it with my own method, is there anything I should watch
out for?
Has anyone done it?
--
regards
jl
I set field "topic" to indexed='false' and stored='true'.
I don't know why it still gets analyzed.
I want it only stored, not analyzed; how can I do that?
--
regards
jl
A correction: I indexed 17M docs, not 1.7M, so the OutOfMemory happened when it
had finished indexing ~11.3M docs.
It is a new index.
I think this may be the reason:
On 7/18/07, Otis Gospodnetic <[EMAIL PROTECTED]> wrote:
> Why? Too small of a Java heap. :)
> Increase the size of the Java heap and lower the maxBu
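For Tomcat, the heap is raised through JAVA_OPTS before startup; a sketch (the sizes are examples only, tune them for your box):

```shell
# picked up by catalina.sh at startup; raise -Xmx until the OOM stops
JAVA_OPTS="-Xms512m -Xmx1024m"
export JAVA_OPTS
```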
When I index 1.7M docs at 4k-5k per doc, an OutOfMemory happens when it has
finished indexing ~1.13M docs.
I just restart Tomcat, delete all locks, and restart the indexing.
There is no error or warning info until it finishes.
Does anyone know why, or has anyone seen the same error?
--
regards
jl
2007/7/18, Ryan McKinley <[EMAIL PROTECTED]>:
Xuesong Luo wrote:
> Hi, there,
> We have one master server and multiple slave servers. The multiple slave
> servers can be run either on the same box or different boxes. For
> slaves on the same box, is there any best practice that they should use
You can find the dataDir configuration in solrconfig.xml (Solr 1.2).
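The entry in question (Solr 1.2 solrconfig.xml; the path is an example — an absolute path avoids working-directory surprises):

```xml
<!-- where Solr keeps the index files -->
<dataDir>/var/solr/data</dataDir>
```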
2007/7/10, nithyavembu <[EMAIL PROTECTED]>:
Hi,
I tried as you said and got the result without any error. So we can put the
Solr home anywhere, but we have to give the path correctly in solr.xml.
Am I correct?
Now I am one step f
I use freebsd.
2007/6/16, Yonik Seeley <[EMAIL PROTECTED]>:
On 6/14/07, James liu <[EMAIL PROTECTED]> wrote:
> I just timing my script to get data from 2 solr boxes, not complete
script.
> It just query two box and return id,score .rows=10. response type use
json.
>
ized
output from solr) result so I can test?
Thanks
-Nick
On 6/28/07, James liu <[EMAIL PROTECTED]> wrote:
> The code did not change, and I do not use utf8_decode. Should I?
>
> 2007/6/28, Nick Jenkin <[EMAIL PROTECTED]>:
> >
> > Hi James
> > It is totally no
I mean define it in schema.xml.
--
regards
jl
The code did not change, and I do not use utf8_decode. Should I?
2007/6/28, Nick Jenkin <[EMAIL PROTECTED]>:
Hi James
It is totally not optimized, when you say change your content into
???, I assume this is because of UTF8 issues, are you using
utf8_decode etc?
Thanks
-Nick
On 6/28/07, Jam
It is slower than json and xml, and it changes my content into ???.
When I use json, the content is OK.
This afternoon I will read your code.
2007/6/27, James liu <[EMAIL PROTECTED]>:
OK, thanks Nick; I just forgot to replace the jar file.
Wait a minute, I will test the speed.
2007/6/27, Nick
OK, thanks Nick; I just forgot to replace the jar file.
Wait a minute, I will test the speed.
2007/6/27, Nick Jenkin <[EMAIL PROTECTED]>:
http://nickjenkin.com/misc/apache-solr-1.2.0-php-serialize.tar.gz
Try that
-Nick
On 6/27/07, James liu <[EMAIL PROTECTED]> wrote:
> i use tomcat
I use Tomcat. Send your Solr version to me and I will try it again.
2007/6/27, Nick Jenkin <[EMAIL PROTECTED]>:
If you are using the example provided in 1.2 (using jetty) you need to
use "ant example"
rather than "ant dist"
-Nick
On 6/27/07, James liu <[EMAIL PROTEC
how about its performance?
2007/6/26, Kijiji Xu, Ping <[EMAIL PROTECTED]>:
I had solved this problem; below is my POST code. I used HTTP_Request from
PEAR; it's so simple. Thank you all very much. FYI:
private function doPost($url, $postData) {
    $req = &new HTTP_Request($url, array(
        'm
The XML data should be bigger than the JSON data, yet it transfers quicker than
JSON. It surprised me.
2007/6/27, Yonik Seeley <[EMAIL PROTECTED]>:
It would be helpful if you could try out the patch at
https://issues.apache.org/jira/browse/SOLR-276
-Yonik
On 6/26/07, Yonik Seeley <[EMAIL PROTECTED]> wro
Very strange; does it fail only for me? Does anyone else have the same problem?
If you have time, maybe you could zip your Solr and mail it to me, and I will
try it again.
2007/6/26, Nick Jenkin <[EMAIL PROTECTED]>:
Interesting, what version of solr are you using, I tested on 1.2.
-Nick
On 6/26/07, James liu <[EMAIL PROTECTED]> wro
2007/6/27, Mike Klaas <[EMAIL PROTECTED]>:
On 25-Jun-07, at 10:53 PM, James liu wrote:
>
> [quote]how can I keep the whole index in RAM, and how do I configure which
> RAM I should use?[/quote]
Your os will automatically load the most frequently-used parts of the
index in ram.
If
Did you try it first? Which system do you use?
If you use FreeBSD, just give up trying; it does not work on FreeBSD.
2007/6/27, Otis Gospodnetic <[EMAIL PROTECTED]>:
Hi,
Here is a puzzling one. I can't get Solr to invoke snaphooter
properly. Solr claims my snapshooter is not where I said it is:
SEVERE: java.
<[EMAIL PROTECTED]>:
I have some good news :o)
https://issues.apache.org/jira/browse/SOLR-275
Please let me know if you find any bugs
Thanks
-Nick
On 6/26/07, James liu <[EMAIL PROTECTED]> wrote:
> I think it is simple for you.
>
> So I will wait for your good news.
>
> 200
Thanks Yonik, and:
[quote]how can I keep the whole index in RAM, and how do I configure which RAM
I should use?[/quote]
> > > Hi James
> > > I think you would be better off outputting a PHP array and running
> > > eval() over it; the PHP serialize format is quite complicated.
> > >
> > > On that note, you might be interested in:
> > > http://issues.apache.org/
10M docs at 4k per doc, or 1M docs at 40k per doc:
which will be faster in the same environment?
--
regards
jl
For example, I want to sort by datetime; does the field have to be
stored='true', and how do I define it?
Am I right?
If so, I want to define score like that too; how do I define it, or maybe it
already is defined.
If my fields all use indexed=true, stored=false, does that mean low disk
IO and more RAM used?
How can I use that?
SolrIndexSearcher, and I did not change it.
2007/6/25, James liu <[EMAIL PROTECTED]>:
I mean: how do I add it to my Solr (1.2 production)?
2007/6/25, James liu <[EMAIL PROTECTED]>:
>
> aha, it seems good; how can I apply it to my Solr? I don't know how to
> work with it
>
>
I mean: how do I add it to my Solr (1.2 production)?
2007/6/25, James liu <[EMAIL PROTECTED]>:
aha, it seems good; how can I apply it to my Solr? I don't know how to work
with it.
2007/6/25, Nick Jenkin <[EMAIL PROTECTED]>:
>
> Hi James
> I think you would be better of ou
ou might be interested in:
http://issues.apache.org/jira/browse/SOLR-196
-Nick
On 6/25/07, James liu <[EMAIL PROTECTED]> wrote:
> Which files should I change in the source?
>
> And if my changes are OK,
>
> how do I compile? Just ant dist?
>
> --
> regards
> jl
>
--
regards
jl
Which files should I change in the source?
And if my changes are OK,
how do I compile? Just ant dist?
--
regards
jl
aha, the same question I found a few days ago.
I am sorry I forgot to report it.
2007/6/22, Yonik Seeley <[EMAIL PROTECTED]>:
On 6/21/07, Ryan McKinley <[EMAIL PROTECTED]> wrote:
> I just started running the scripts and
>
> The commit script seems to run fine, but it says there was an error. I
> looke
aha, sorry, I missed it.
2007/6/21, Chris Hostetter <[EMAIL PROTECTED]>:
: curl http://192.168.7.6:8080/solr0/update --data-binary
: '<delete><query>nodeid:20</query></delete>'
:
: i remember it is ok when i use solr 1.1
...
: HTTP Status 400 - missing content stream
please note the "Upgrading from Solr 1.1" section o
Solr 1.2:
curl http://192.168.7.6:8080/solr0/update --data-binary
'<delete><query>nodeid:20</query></delete>'
I remember it was OK when I used Solr 1.1.
Has something changed?
It shows me:
HTTP Status 400 - missing content stream
--
*type* Status report
*message* *missing content stream*
*description* *The
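In Solr 1.2 the update handler is stricter about request bodies; "missing content stream" usually means the POST carried no recognized Content-Type. A sketch of building such a request with an explicit header (Python for illustration; the URL and the delete-by-query body are examples, and nothing is actually sent here):

```python
import urllib.request

def build_update_request(url: str, body_xml: str) -> urllib.request.Request:
    # Solr 1.2 needs an explicit content type to treat the body as a stream.
    return urllib.request.Request(
        url,
        data=body_xml.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8"},
    )

req = build_update_request(
    "http://192.168.7.6:8080/solr0/update",
    "<delete><query>nodeid:20</query></delete>",
)
```

With curl, the equivalent fix is adding -H 'Content-Type: text/xml; charset=utf-8' to the command.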
I see SOLR-215 from this mail.
Does it now really support multiple indexes, and will a search return merged
data?
For example:
I want to search for "aaa" and I have index1, index2, index3, index4. It should
return the results from index1, index2, index3, index4 and merge them by
score, datetime, or another field.
If just one master or one slave server fails, I think maybe you can use the
master index server.
Shell scripts controlled by a program are easy for me; I use PHP and shell_exec.
2007/6/21, Otis Gospodnetic <[EMAIL PROTECTED]>:
Right, that SAN con 2 Masters sounds good. Lucky you with your lonely
Master! Wh
OK, I find it only happens on Windows.
2007/6/19, James liu <[EMAIL PROTECTED]>:
It seems strange when I refresh the same URL search.
The time changes: sometimes it takes 0.01021409034729 s, sometimes
0.0080091953277588 s, sometimes 0.024219989776611 s.
The variation is too big.
Only I use
It seems strange when I refresh the same URL search.
The time changes: sometimes it takes 0.01021409034729 s, sometimes
0.0080091953277588 s, sometimes 0.024219989776611 s.
The variation is too big.
Only I am using it, and there are few searches, so I think memory is not all
used.
Why does the time vary so much? And I thi
Thanks.
2007/6/17, Yonik Seeley <[EMAIL PROTECTED]>:
On 6/16/07, James liu <[EMAIL PROTECTED]> wrote:
> I want to show keyword: a and facet sid: 2
>
> my url:
>
http://localhost:8080/solr1/select?q=a+sid:2&start=0&rows=10&fl=*&wt=json
>
> but it show m
For example:
I want to show keyword: a and facet sid: 2.
My url:
http://localhost:8080/solr1/select?q=a+sid:2&start=0&rows=10&fl=*&wt=json
But it shows me a count bigger than the facet number.
I read http://lucene.apache.org/java/docs/queryparsersyntax.html
and tried several ways; none of them had any effect.
Maybe some
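One likely cause: with the default OR operator, q=a+sid:2 matches documents containing "a" OR matching sid:2, so the total hit count can exceed the facet count. Putting the restriction in a filter query keeps the numbers consistent; a sketch of building that URL (Python for illustration; host and field name are taken from the message above):

```python
from urllib.parse import urlencode

# fq restricts the result set without affecting relevance scoring
params = {"q": "a", "fq": "sid:2", "start": 0, "rows": 10, "fl": "*", "wt": "json"}
url = "http://localhost:8080/solr1/select?" + urlencode(params)
```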
Maybe you will find it in apache-solr-1.2.0\example\logs.
And I do not use Jetty.
2007/6/15, Jack L <[EMAIL PROTECTED]>:
Yeah, I'm running 1.1 with jetty.
But I didn't find *.log in the whole solr directory.
Is jetty putting the log files outside the directory?
> what version of solr/container a
If you use Jetty, you should look at Jetty's log.
If you use Tomcat, you should look at Tomcat's log.
Solr is only a program that runs inside the container.
2007/6/15, Ryan McKinley <[EMAIL PROTECTED]>:
what version of solr/container are you running?
this sounds similar to what people running solr 1.1 with the je
2007/6/14, Yonik Seeley <[EMAIL PROTECTED]>:
On 6/14/07, James liu <[EMAIL PROTECTED]> wrote:
> I wrote a script to measure run time, to check the performance.
>
> I find a very interesting thing: I query 2 Solr boxes to get data, and the
> Solr responses show me a QTime of zero.
is it ok?
2007/6/14, vanderkerkoff <[EMAIL PROTECTED]>:
Hi Yonik
Here's the output from netcat
POST /solr/update HTTP/1.1
Host: localhost:8983
Accept-Encoding: identity
Content-Length: 83
Content-Type: text/xml; charset=utf-8
that looks Ok to me, but I am a bit twp you see.
:-)
Yonik Seel
I wrote a script to measure run time, to check the performance.
I find a very interesting thing: I query 2 Solr boxes to get data, and the
Solr responses show me a QTime of zero for both.
But I find the multi-get-data script takes 0.046674966812134 s (it varies).
The Solr boxes are on my PC, and the index data is very small.
2007/6/7, Yonik Seeley <[EMAIL PROTECTED]>:
On 6/6/07, James liu <[EMAIL PROTECTED]> wrote:
> anyone agree?
No ;-)
At least not if you mean using map-reduce for queries.
When I started looking at distributed search, I immediately went and
read the map-reduce paper (easier
Does anyone agree?
What is the plan for Solr's next development cycle? Does anyone know?
--
regards
jl
Thanks, Ryan; I found "required" in CHANGES.txt.
2007/6/4, Ryan McKinley < [EMAIL PROTECTED]>:
>
> i modifiy it and now start is ok
>
>>
>>
>
> What does the property "required" mean?
> I did not find it in the comments.
>
"required" means that the field *must* be specified when you add it to
the index. If it i
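In schema.xml that looks like this (the field name is an example):

```xml
<!-- documents missing this field are rejected at add time -->
<field name="id" type="string" indexed="true" stored="true" required="true"/>
```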
Solr 1.3-dev 2007-06-04 (svn).
The Tomcat log shows me this error:
solr 1.3dev 2007-06-04
org.apache.solr.core.SolrException: Unknown fieldtype 'string'
I find it is only used in schema.xml.
I modified it and now startup is OK.
What does the property "required" mean?
I did not find it in the comments.
--
Thanks, Solr committers.
--
regards
jl
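The 'string' type is not built in; it has to be declared in schema.xml before any field references it. The stock example schema of that era declares it roughly as:

```xml
<fieldType name="string" class="solr.StrField" sortMissingLast="true" omitNorms="true"/>
```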
2007/5/29, Chris Hostetter <[EMAIL PROTECTED]>:
: > facet.analyzer is true, do analyze, if false don't analyze.
: What if Solr doesn't have access to the unindexed version? My
: suggestion would be to copyField into an unanalyzed version, and
: facet on that.
me too.
yeah, i'm not even su
If facet.analyzer is true, do analyze; if false, don't analyze.
Why do I say that: Chinese words are not separated by spaces, so if they are
analyzed, they change.
For now, I will use a map to fix it, until there is a facet.analyzer.
--
regards
jl
How do you make sure your file is encoded in UTF-8?
2007/5/24, Ethan Gruber <[EMAIL PROTECTED]>:
Hi,
I am attempting to post some unicode XML documents to my solr index. They
are encoded in UTF-8. When I attempt to query from the solr admin page,
I'm
basically getting gibberish garbage text in return
Multiple layers:
Solr's current procedure:
user query -> Solr instance -> show results
I think that may be simple enough for some applications.
Maybe this procedure would fit:
user query -> master Solr query instance -> single Solr query instance ->
show results
The master Solr query instance:
it can define some glo
2007/5/25, Chris Hostetter <[EMAIL PROTECTED]>:
: when I index data with 45 Solr boxes (the data has 17 million rows, FreeBSD 6,
: Java: diablo-1.5.0_07-b01, Tomcat 6), a write lock happens during the procedure.
1) bug reports about errors are nearly useless without a real error
message including a stack trace
I find it always happens after indexing has been running for a while.
For example, it happens 1 to 2 hours after the indexing starts.
2007/5/24, James liu <[EMAIL PROTECTED]>:
I find one interesting thing.
When I index data with 45 Solr boxes (the data has 17 million rows; FreeBSD 6;
Java diablo-1.
I find one interesting thing.
When I index data with 45 Solr boxes (the data has 17 million rows; FreeBSD 6;
Java diablo-1.5.0_07-b01; Tomcat 6), a write lock happens during the procedure.
Reindexing on the Solr box that had the write-lock problem
shows it is fine.
It has happened several times, so I want to know why i
aha, it is wonderful.
2007/5/24, Mike Austin <[EMAIL PROTECTED]>:
Just one.
-Original Message-
From: James liu [mailto:[EMAIL PROTECTED]
Sent: Wednesday, May 16, 2007 10:30 PM
To: solr-user@lucene.apache.org
Subject: Re: PriceJunkie.com using solr!
How many Solr instances?
myself clear here
Thanks
-Amit
James liu wrote:
>
> first, try enabling highlighting (
> http://wiki.apache.org/solr/HighlightingParameters)
>
> and look at the output in the solr admin GUI; you will find what you want.
>
>
>
> 2007/5/23, solruser <[EMAIL PROTECTED]>:
&
First, try enabling highlighting
(http://wiki.apache.org/solr/HighlightingParameters).
Then look at the output in the Solr admin GUI, and you will find what you want.
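The wiki parameters just get appended to the select URL; a sketch of building them (Python for illustration; the field name is an assumption):

```python
from urllib.parse import urlencode

# hl switches highlighting on; hl.fl names the field(s) to highlight
params = {"q": "solr", "hl": "true", "hl.fl": "content"}
query_string = urlencode(params)
```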
2007/5/23, solruser <[EMAIL PROTECTED]>:
Hi,
I am wondering whether we can get the list of all highlighted terms from the
search query. If
First, you should know your goal.
Second, you should analyze your search interface and which one fits your
customers.
Third, analyze your queries (optimize Solr for the most-used queries).
Does 40 threads/s mean you use 40 Solr instances, or does it just show a higher
rate of user queries?
2007/5/21, Yonik Seeley <[EMAIL PROTECT
The attachment is a json_encode string which has "@".
Now I find it is a bug in the PHP JSON extension, because it happens even when
the encoded string does not have "@".
Now I use the JSON-PHP class (http://mike.teczno.com/json.html).
Test code (PHP):
http://bbs.qq.com/cgi-bin/bbs/show/content?club=0&groupid=1
2007/5/18, Chris Hostetter <[EMAIL PROTECTED]>:
:
: if you get null from json_decode($data), maybe your $data has '@'. The fix
: is to replace it before you do json_decode.
:
: I tried json_encode with PHP and json_decode with PHP; there is no problem
: when I use '@'.
:
: maybe it only happens on encode (by j
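For what it's worth, "@" needs no escaping in JSON itself, so a null from json_decode points at the extension rather than the format. A quick standalone check (Python here, just to exercise the format):

```python
import json

payload = {"msg": "contact me @ the list"}
encoded = json.dumps(payload)   # '@' passes through unescaped
decoded = json.loads(encoded)   # round-trips cleanly
```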