Re: Disable (or prohibit) per-field overrides

2010-10-17 Thread Markus Jelsma
Hi,

Thanks for the suggestion and pointer. We've implemented it using a single 
regex in Nginx for now. 

Cheers,

 : Does anyone know a useful method to disable or prohibit the per-field
 : override features for the search components? If not, where should I start
 : to make it configurable via solrconfig and attempt to come up with a
 : working patch?
 
 If your goal is to prevent *clients* from specifying these (while you're
 still allowed to use them in your defaults) then the simplest solution is
 probably something external to Solr -- along the lines of mod_rewrite.
 
 Internally...
 
 that would be tough.
 
 You could probably write a SearchComponent (configured to run first)
 that does it fairly easily -- just wrap the SolrParams in an impl that
 returns null anytime a component asks for a param name that starts with
 f. (and excludes those param names when asked for a list of the param
 names)
 
 
 It could probably be generalized to support arbitrary rules in a way
 that might be handy for other folks, but it would still just be
 wrapping all of the params, so it would prevent you from using them
 in your config as well.
 
 Ultimately I think a general solution would need to be in
 RequestHandlerBase ... where it wraps the request params using the
 defaults and invariants ... you'd want the custom exclusion rules to apply
 only to the request params from the client.
 
 
 
 
 -Hoss
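The wrapping idea described above can be sketched outside of Solr: the filtering rule below uses plain Java maps in place of SolrParams (a real wrapper would need the Solr jars), and the class name is made up for illustration.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the filtering rule: hide any request parameter whose name
// starts with "f." from downstream components. Plain Java maps stand in
// for Solr's SolrParams here; the class name is illustrative.
public class PerFieldParamFilter {

    // Return a copy of the params with every per-field override removed.
    public static Map<String, String> stripPerFieldOverrides(Map<String, String> params) {
        Map<String, String> filtered = new LinkedHashMap<String, String>();
        for (Map.Entry<String, String> e : params.entrySet()) {
            if (!e.getKey().startsWith("f.")) {
                filtered.put(e.getKey(), e.getValue());
            }
        }
        return filtered;
    }

    public static void main(String[] args) {
        Map<String, String> params = new LinkedHashMap<String, String>();
        params.put("q", "solr");
        params.put("facet.limit", "10");
        params.put("f.category.facet.limit", "50"); // per-field override
        System.out.println(stripPerFieldOverrides(params)); // prints {q=solr, facet.limit=10}
    }
}
```

In a real SearchComponent you would apply the same rule inside a SolrParams subclass, in both get() and getParameterNamesIterator().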


Re: SolrJ new javabin format

2010-10-17 Thread Markus Jelsma
Well, in Nutch we simply replace the two jars and it all still works.

   The CHANGES.txt file in branch_3x says that the javabin format has
 changed in Solr 3.1, so you need to update SolrJ as well as Solr.  Is
 the SolrJ included in 3.1 compatible with both 3.1 and 1.4.1?  If not,
 that's going to make a graceful upgrade of my replicated distributed
 installation a little harder.
 
 Thanks,
 Shawn


Re: How do you programatically create new cores?

2010-10-17 Thread Marc Sturlese

You have to create the core's folder with its conf inside the Solr home.
Once done you can call the create action of the admin handler:
http://wiki.apache.org/solr/CoreAdmin#CREATE
If you need to dynamically create, start and stop lots of cores there's this
patch, but I don't know about its current state:
http://wiki.apache.org/solr/LotsOfCores

-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/How-do-you-programatically-create-new-cores-tp1706487p1718648.html
Sent from the Solr - User mailing list archive at Nabble.com.


query between two date

2010-10-17 Thread nedaha

Hi there,

First I have to explain the situation.
I have two indexed fields, named tdm_avail1 and tdm_avail2, which are arrays
of dates.
This is a sample doc:

<arr name="tdm_avail1">
  <date>2010-10-21T08:29:43Z</date>
  <date>2010-10-22T08:29:43Z</date>
  <date>2010-10-25T08:29:43Z</date>
  <date>2010-10-26T08:29:43Z</date>
  <date>2010-10-27T08:29:43Z</date>
</arr>

<arr name="tdm_avail2">
  <date>2010-10-19T08:29:43Z</date>
  <date>2010-10-20T08:29:43Z</date>
  <date>2010-10-21T08:29:43Z</date>
  <date>2010-10-22T08:29:43Z</date>
</arr>

And in my search form I have 2 fields, named check-in date and check-out date.
I want Solr to compare the range that the user enters in the search form with
the values of tdm_avail1 and tdm_avail2, and return the doc only if all dates
between the check-in and check-out dates match the tdm_avail1 or tdm_avail2
values.

For example, if the user enters:
check-in date: 2010-10-19
check-out date: 2010-10-21
that matches tdm_avail2, so the doc must be returned.

But if the user enters:
check-in date: 2010-10-25
check-out date: 2010-10-29
the doc must not be returned.

So I want a query that gives me this result. Could you please help me?
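Not an answer from the thread, but one possible query shape, assuming every indexed date carries the same normalized time component (T08:29:43Z, as in the sample doc): AND one exact-match clause per night, and OR the two fields' groups. A plain range query would not work here, because it only requires *some* value in the range to exist, not all of them.

```
q=(tdm_avail1:"2010-10-19T08:29:43Z" AND tdm_avail1:"2010-10-20T08:29:43Z"
   AND tdm_avail1:"2010-10-21T08:29:43Z")
  OR
  (tdm_avail2:"2010-10-19T08:29:43Z" AND tdm_avail2:"2010-10-20T08:29:43Z"
   AND tdm_avail2:"2010-10-21T08:29:43Z")
```

The clause list would have to be built by the application from the check-in and check-out dates.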

thanks in advance

-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/query-between-two-date-tp1718566p1718566.html
Sent from the Solr - User mailing list archive at Nabble.com.


Spanning an index across multiple volumes

2010-10-17 Thread Peter Sturge
Is it possible to get an index to span multiple disk volumes - i.e.
when its 'primary' volume fills up (or optimize needs more room), tell
Solr/Lucene to use a secondary/tertiary/quaternary et al volume?

I've not seen any configuration that would allow this, but maybe
others have a use case for such functionality?

Thanks,
Peter


DIH - configure password in 1 place and store it in encrypted form?

2010-10-17 Thread Arunkumar Ayyavu
Hi!

I have multiple cores reading from the same database and I've provided
the user credentials in all data-config.xml files. Is there a way to
tell JdbcDataSource in data-config.xml to read the username and
password from a file? This would help me not to change the
username/password in multiple data-config.xml files.

And is it possible to store the password in encrypted form and have DIH
call a decrypter to read the password?

Thanks a lot.

-- 
Arun


Re: Spanning an index across multiple volumes

2010-10-17 Thread Jan Høydahl / Cominvent
Juggling disk volumes does not sound like a logical responsibility for Solr to 
me. Solr/Lucene expects to have enough room to live in.
Better to push this down to the OS level. There are all kinds of logical volume 
managers around that let you add new disks to the same logical volume, 
achieving what you want. Or if you run on a cloud architecture, you may 
increase disk space with a couple of API calls triggered by monitoring...
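As an illustration of the OS-level route, growing a Linux LVM volume underneath Solr might look like the following; the device names, volume group, and sizes are assumptions, not recommendations:

```
# Assumes LVM2, an existing volume group vg0 holding the Solr data
# volume, and an ext4 filesystem; adjust device names and sizes.
pvcreate /dev/sdb1                   # prepare the new disk
vgextend vg0 /dev/sdb1               # add it to the volume group
lvextend -L +100G /dev/vg0/solrdata  # grow the logical volume
resize2fs /dev/vg0/solrdata          # grow the filesystem
```

Solr just sees one directory with more free space; no Solr configuration changes.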

--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com

On 17. okt. 2010, at 12.38, Peter Sturge wrote:

 Is it possible to get an index to span multiple disk volumes - i.e.
 when its 'primary' volume fills up (or optimize needs more room), tell
 Solr/Lucene to use a secondary/tertiary/quaternary et al volume?
 
 I've not seen any configuration that would allow this, but maybe
 others have a use case for such functionality?
 
 Thanks,
 Peter



indexing mysql database

2010-10-17 Thread do3do3

I am trying to index a table in a MySQL database.
1st I created a db-config.xml file which contains

<dataSource type="JdbcDataSource" name="1stTrial"
    driver="com.mysql.jdbc.Driver" encoding="UTF-8"
    url="jdbc:mysql://localhost:3306/(database name)"
    user="(user)" password="(password)"
    batchSize="-1"/>

followed by the entity and its table field definitions:

<entity dataSource="1stTrial" name="(table name)" pk="id"
    query="select * from (table name)">
  <field column="id" name="ID"/>
  <field column="Text1" name="(field name)"/>
</entity>

2nd I added these fields in the schema.xml file,
and finally declared the db-config.xml file in solrconfig.xml as

<requestHandler name="/dataimport"
    class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <str name="config">db-data-config.xml</str>
  </lst>
</requestHandler>

I found that the index folder contains only the segments.gen & segments_1
files, and when I try to search I get no results.
Can anybody help?
thanks in advance 

-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/indexing-mysql-database-tp1719883p1719883.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: indexing mysql database

2010-10-17 Thread William Pierce
Two suggestions: a) I noticed that your DIH spec in solrconfig.xml seems 
to refer to db-data-config.xml, but you said that your file was 
db-config.xml. You may want to check this to make sure that your file 
names are correct. b) What does your log say when you ran the import 
process?


- Bill

-Original Message- 
From: do3do3

Sent: Sunday, October 17, 2010 8:29 AM
To: solr-user@lucene.apache.org
Subject: indexing mysql database





Re: How do you programatically create new cores?

2010-10-17 Thread Tharindu Mathew
Hi Marc, 

Thanks for the reply. 

So, as I understand it, I need to make an HTTP GET call with the action 
parameter set to 'create' to dynamically create a core? I do not see an API to 
do this anywhere. 
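For reference, the CoreAdmin CREATE action is just an HTTP GET of roughly this shape; the host, port, and the core/instanceDir names here are placeholders, and the instanceDir with its conf must already exist:

```
http://localhost:8983/solr/admin/cores?action=CREATE&name=core1&instanceDir=core1&config=solrconfig.xml&schema=schema.xml
```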

On Oct 17, 2010, at 3:54 PM, Marc Sturlese marc.sturl...@gmail.com wrote:

 
 You have to create the core's folder with its conf inside the Solr home.
 Once done you can call the create action of the admin handler:
 http://wiki.apache.org/solr/CoreAdmin#CREATE
 If you need to dynamically create, start and stop lots of cores there's this
 patch, but I don't know about its current state:
 http://wiki.apache.org/solr/LotsOfCores
 


Re: DIH - configure password in 1 place and store it in encrypted form?

2010-10-17 Thread Gora Mohanty
On Sun, Oct 17, 2010 at 7:02 PM, Arunkumar Ayyavu
arunkumar.ayy...@gmail.com wrote:
 Hi!

 I have multiple cores reading from the same database and I've provided
 the user credentials in all data-config.xml files. Is there a way to
 tell JdbcDataSource in data-config.xml to read the username and
 password from a file? This would help me not to change the
 username/password in multiple data-config.xml files.

 And is it possible to store the password in encrypted form and have DIH
 call a decrypter to read the password?
[...]

As far as I am aware, it is not possible to do either of the two
options above. However, one could extend the JdbcDataSource
class to add such functionality.
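The file-reading half of such an extension can be sketched in plain Java; the properties file name and key names below are made up, and the hook into JdbcDataSource (which needs the Solr jars) is only described in the comments:

```java
import java.io.FileReader;
import java.io.IOException;
import java.util.Properties;

// Sketch: load DB credentials from one shared properties file so each
// core's data-config.xml no longer embeds them. A subclass of
// JdbcDataSource could call this from its init() and also run a
// decryption step on the password; that wiring is omitted here.
// File name and property keys are assumptions, not a Solr convention.
public class DbCredentials {

    public static Properties load(String path) throws IOException {
        Properties props = new Properties();
        FileReader reader = new FileReader(path);
        try {
            props.load(reader);
        } finally {
            reader.close();
        }
        return props;
    }

    public static void main(String[] args) throws IOException {
        java.io.File f = new java.io.File("db-credentials.properties");
        if (f.exists()) {
            System.out.println(load(f.getPath()).getProperty("db.user"));
        }
    }
}
```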

Regards,
Gora


how can i use solrj binary format for indexing?

2010-10-17 Thread Jason, Kim

Hi all
I have a huge number of XML files to index.
I want to index using the SolrJ binary format to get a performance gain,
because I heard that indexing via XML files is quite slow.
But I don't know how to index through the SolrJ binary format and can't find
any examples.
Please give me some help.
Thanks,
-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/how-can-i-use-solrj-binary-format-for-indexing-tp1722612p1722612.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Spanning an index across multiple volumes

2010-10-17 Thread Otis Gospodnetic
Hi,

The closest thing I can think of is FileSwitchDirectory ( 
http://search-lucene.com/?q=FileSwitchDirectory )
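For anyone curious, a rough sketch of how FileSwitchDirectory splits an index by file extension; this needs the Lucene jars so it is not runnable standalone, and the extension set and paths are purely illustrative:

```java
// Route .frq/.prx files (often the largest) to a second volume;
// everything else stays on the first. Extensions whose names are in the
// set go to the first Directory argument; the rest go to the second.
Set<String> bigExtensions = new HashSet<String>(Arrays.asList("frq", "prx"));
Directory big = FSDirectory.open(new File("/volumes/big/index"));
Directory rest = FSDirectory.open(new File("/volumes/primary/index"));
Directory dir = new FileSwitchDirectory(bigExtensions, big, rest, true);
```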

Otis

Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch
Lucene ecosystem search :: http://search-lucene.com/



- Original Message 
 From: Jan Høydahl / Cominvent jan@cominvent.com
 To: solr-user@lucene.apache.org
 Sent: Sun, October 17, 2010 10:15:33 AM
 Subject: Re: Spanning an index across multiple volumes
 
 Juggling disk volumes does not sound like a logical responsibility for Solr to
 me. Solr/Lucene expects to have enough room to live in.
 Better to push this down to the OS level. There are all kinds of logical
 volume managers around that let you add new disks to the same logical volume,
 achieving what you want. Or if you run on a cloud architecture, you may
 increase disk space with a couple of API calls triggered by monitoring...
 
 --
 Jan Høydahl, search solution architect
 Cominvent AS - www.cominvent.com
 
 On 17. okt. 2010, at 12.38, Peter Sturge wrote:
 
  Is it possible to get an index to span multiple disk volumes - i.e.
  when its 'primary' volume fills up (or optimize needs more room), tell
  Solr/Lucene to use a secondary/tertiary/quaternary et al volume?
  
  I've not seen any configuration that would allow this, but maybe
  others have a use case for such functionality?
  
  Thanks,
  Peter



Re: how can i use solrj binary format for indexing?

2010-10-17 Thread Gora Mohanty
On Mon, Oct 18, 2010 at 8:31 AM, Jason, Kim hialo...@gmail.com wrote:

 Hi all
 I have a huge number of XML files to index.
 I want to index using the SolrJ binary format to get a performance gain,
 because I heard that indexing via XML files is quite slow.
[...]

I do not know about SolrJ's binary format, but indexing through XML
is quite fast in our experience. Have you tried it out to see if it meets
your requirements?
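For what it's worth, switching SolrJ to the javabin format is a request-writer setting on the server object. This sketch needs the SolrJ jars and a running Solr, so it is not runnable standalone; the URL is a placeholder:

```java
// Send documents in the javabin (binary) format instead of XML.
CommonsHttpSolrServer server =
    new CommonsHttpSolrServer("http://localhost:8983/solr");
server.setRequestWriter(new BinaryRequestWriter());

SolrInputDocument doc = new SolrInputDocument();
doc.addField("id", "1");
server.add(doc);
server.commit();
```

Everything else in the indexing code stays the same; only the wire format changes.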

Regards,
Gora