Multi part field

2013-12-09 Thread steven crichton
I am trying to implement a ranged field type in a booking system.

The price structure varies between two dates (determined by the property
owner), so it looks like this:

Date A - Date B = Price Value

I've been looking through a lot of docs, but so far have not been able to
find how I could possibly implement such an object within SOLR.

the only thing I have so far thought of is to have 2 fields:

- DATE PRICE RANGE
- PRICE RANGE VAL

then find the index of the matching DATE PRICE RANGE entry and use that
same index into PRICE RANGE VAL to get the value.
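A minimal sketch of that two-parallel-fields idea (the field names and the `start_end` string encoding are hypothetical, and the range matching here happens client-side after retrieving the document):

```python
from datetime import date

# Hypothetical parallel multiValued fields: index i of date_price_range
# corresponds to index i of price_range_val.
doc = {
    "id": "property-1",
    # each entry encodes "start_end" as ISO dates (assumed convention)
    "date_price_range": ["2013-01-05_2013-03-02", "2013-03-02_2013-04-06"],
    "price_range_val": [760, 800],
}

def price_for(doc, day):
    """Return the price whose range contains `day`, or None."""
    for i, rng in enumerate(doc["date_price_range"]):
        start_s, end_s = rng.split("_")
        start, end = date.fromisoformat(start_s), date.fromisoformat(end_s)
        if start <= day < end:  # half-open: end date belongs to next range
            return doc["price_range_val"][i]
    return None

print(price_for(doc, date(2013, 2, 1)))  # 760
```

The half-open interval convention avoids double-matching a date that is both one range's end and the next range's start, as in the example data.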


Any help would be very much appreciated, as this is make-or-break for the
new search system on our site just now.



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Multi-part-field-tp4105685.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multi part field - EXAMPLE DATA

2013-12-09 Thread steven crichton
prices: [
  {"start-date": "05-01-2013", "end-date": "02-03-2013", "price": 760},
  {"start-date": "02-03-2013", "end-date": "06-04-2013", "price": 800},
  {"start-date": "06-04-2013", "end-date": "01-06-2013", "price": 1028},
  {"start-date": "01-06-2013", "end-date": "29-06-2013", "price": 1240},
  {"start-date": "29-06-2013", "end-date": "06-07-2013", "price": 1340},
  {"start-date": "06-07-2013", "end-date": "10-08-2013", "price": 1678},
  {"start-date": "10-08-2013", "end-date": "24-08-2013", "price": 1578},
  {"start-date": "24-08-2013", "end-date": "31-08-2013", "price": 1340},
  {"start-date": "31-08-2013", "end-date": "21-09-2013", "price": 1240},
  {"start-date": "21-09-2013", "end-date": "19-10-2013", "price": 1028},
  {"start-date": "19-10-2013", "end-date": "02-11-2013", "price": 800},
  {"start-date": "02-11-2013", "end-date": "11-01-2014", "price": 760}
],
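For indexing, the example data above could be flattened into parallel multiValued fields. A sketch (field names are hypothetical; Solr date fields expect ISO-8601, so the dd-mm-yyyy values need converting):

```python
import json
from datetime import datetime

# First two entries of the example data, quoted so they parse as JSON.
raw = '''[{"start-date": "05-01-2013", "end-date": "02-03-2013", "price": 760},
          {"start-date": "02-03-2013", "end-date": "06-04-2013", "price": 800}]'''

def to_iso(d):
    # convert dd-mm-yyyy to a Solr-friendly ISO-8601 date string
    return datetime.strptime(d, "%d-%m-%Y").strftime("%Y-%m-%dT00:00:00Z")

prices = json.loads(raw)
doc = {
    "id": "property-1",                # hypothetical document id
    "price_start_dates": [to_iso(p["start-date"]) for p in prices],
    "price_end_dates": [to_iso(p["end-date"]) for p in prices],
    "price_values": [p["price"] for p in prices],
}
print(doc["price_start_dates"][0])  # 2013-01-05T00:00:00Z
```

Keeping the three fields index-aligned preserves the "look up the matching range, read the price at the same index" approach from the first post.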




Solr Stalls on Bulk indexing, no logs or errors

2013-12-04 Thread steven crichton
I am finding, with a bulk index using Solr 4.3 on Tomcat, that when I reach
69,578 records the server stops adding anything more.

I've tried reducing the data sent to the bare minimum of fields and using
ASC and DESC data to see if it could be a field issue.

Is there anything I could look at for this? As I'm not finding anything
similar noted before. Does tomcat have issues with closing connections that
look like DDOS attacks? Or could it be related to too many commits in too
short a time?

Any help will be very greatly appreciated.





Re: Solr Stalls on Bulk indexing, no logs or errors

2013-12-04 Thread steven crichton
Yes, I can continue to query after this importer goes down, and whilst it is running.

The bulk commit is done via a JSON handler in PHP. There are 121,000 records 
that need to go into the index, so this is done in 5,000-record chunked MySQL 
retrieve calls, parsing the data as required. 

workflow:

get record
create {add doc… } JSON
Post to CORE/update/json
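That workflow could be sketched like this (the core URL is an assumption, the chunk size matches the 5,000-record batches described above, and the POST helper is defined but not actually called against a server here):

```python
import json
import urllib.request

SOLR_UPDATE = "http://localhost:8983/solr/core/update/json"  # assumed URL

def chunks(records, size=5000):
    """Yield successive `size`-record slices of `records`."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def post_chunk(docs, commit=False):
    """POST one batch to the JSON update handler; commit only when asked."""
    url = SOLR_UPDATE + ("?commit=true" if commit else "")
    req = urllib.request.Request(
        url, data=json.dumps(docs).encode("utf-8"),
        headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)

# 121,000 records in 5,000-doc batches -> 25 POSTs
records = [{"id": str(n), "title": "doc %d" % n} for n in range(121000)]
batches = list(chunks(records))
print(len(batches))  # 25
# for b in batches: post_chunk(b)   # against a live Solr, rely on autoCommit
# post_chunk([], commit=True)       # one explicit hard commit at the end
```

Relying on autoCommit during the run and issuing a single hard commit at the end keeps the commit rate low, which is one of the suspects raised in the original question.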


I stopped doing a hard commit every 1,000 records, to see if that was the issue.


The autoCommit settings are:

<autoCommit>
  <maxDocs>${solr.autoCommit.MaxDocs:5000}</maxDocs>
  <maxTime>${solr.autoCommit.MaxTime:24000}</maxTime>
</autoCommit>


I’ve pretty much worked from the Drupal schemas for Solr 4:
https://drupal.org/project/apachesolr

At one point I thought it could be malformed data, but even reducing the 
records down to just the id and title, it crashes at the same point: queries 
still work, but the import handler does nothing at all.


Tomcat logs seem to indicate no major issues.


There isn’t some strange variable that sets an upper index limit, is 
there?

Regards,
Steven



On 4 Dec 2013, at 20:02, Erick Erickson [via Lucene] 
ml-node+s472066n4104984...@n3.nabble.com wrote:

 There's a known issue with SolrCloud with multiple shards, but 
 you haven't told us whether you're using that. The test for 
 whether you're running in to that is whether you can continue 
 to _query_, just not update. 
 
 But you need to tell us more about your setup. In particular 
 your commit settings (hard and soft), your solrconfig settings, 
 particularly around autowarming, how you're bulk indexing, 
 SolrJ? DIH? a huge CSV file? 
 
 Best, 
 Erick 
 
 




