There could be multiple ways of getting this done, and the right one depends
a lot on factors like: what system are you using? How quickly does the change
have to be reflected back in the index? How is the indexing/replication
done?
Usually, in cases where the tolerance is about 6 hours (i.e. your DB change
won't be reflected in the Solr index for up to 6 hours), you can set up a cron
job to be triggered every 6 hours. It will collect all the changes made in
that window, update the index, and commit it.
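The cron-driven delta job described above can be sketched roughly as below. The schema, table, and field names are made up for illustration, and the actual push to Solr is stubbed out so the delta logic itself is runnable (here against an in-memory SQLite database):

```python
import sqlite3
from datetime import datetime, timedelta

def fetch_changes_since(conn, last_run):
    """Return rows modified after the previous cron run."""
    cur = conn.execute(
        "SELECT id, title FROM products WHERE last_modified > ?",
        (last_run,),
    )
    return cur.fetchall()

# --- demo setup with illustrative data ---
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE products (id INTEGER, title TEXT, last_modified TEXT)"
)
now = datetime(2011, 7, 31, 12, 0, 0)
conn.executemany(
    "INSERT INTO products VALUES (?, ?, ?)",
    [
        (1, "old row", (now - timedelta(hours=10)).isoformat()),
        (2, "changed row", (now - timedelta(hours=2)).isoformat()),
    ],
)

# The job remembers when it last ran (e.g. in a small state table or file)
last_run = (now - timedelta(hours=6)).isoformat()
changed = fetch_changes_since(conn, last_run)
# In the real job, each row would become a Solr document, with a single
# commit issued at the end of the batch.
print(changed)  # only the row modified inside the 6-hour window
```

The key point is that the job only touches rows changed since its last run, and commits once per batch rather than per document.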
In cases where there is a more real-time requirement, there could be a trigger
in the application (and not at the DB level), which would fork a process to
update Solr about the change by means of a delayed task. If using this
approach, it is suggested to autocommit every N documents, where N could be
anything depending on your app.
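The "commit every N documents" idea can also be handled server-side by Solr's autoCommit setting in solrconfig.xml (maxDocs), but a client-side sketch of the batching logic looks like this. The Solr calls are stubbed out (assumptions noted in comments) so the batching behaviour itself is runnable:

```python
class BatchedIndexer:
    """Buffers documents and commits once per batch of N."""

    def __init__(self, commit_every=3):
        self.commit_every = commit_every  # N, tune per application
        self.pending = []
        self.commits = 0

    def add(self, doc):
        self.pending.append(doc)
        if len(self.pending) >= self.commit_every:
            self.flush()

    def flush(self):
        if self.pending:
            # Real code would do something like:
            #   solr.add(self.pending); solr.commit()
            self.pending = []
            self.commits += 1

indexer = BatchedIndexer(commit_every=3)
for i in range(7):
    indexer.add({"id": i})
indexer.flush()          # flush the final partial batch
print(indexer.commits)   # two full batches of 3, plus one batch of 1
```

Choosing N is a trade-off: larger batches mean fewer, cheaper commits but a longer window before changes become searchable.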
*Pranav Prakash*
temet nosce
Twitter http://twitter.com/pranavprakash | Blog http://blog.myblive.com |
Google http://www.google.com/profiles/pranny
On Sun, Jul 31, 2011 at 02:32, Alexei Martchenko
ale...@superdownloads.com.br wrote:
I always have a field in my databases called datelastmodified, so whenever I
update a record, I set it to getdate() - the MSSQL function - and then fetch
all the latest records ordered by that field.
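The datelastmodified pattern described above can be sketched as follows, using SQLite in place of MSSQL (so a literal timestamp stands in for getdate(); table and column names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE records (id INTEGER PRIMARY KEY, body TEXT,"
    " datelastmodified TEXT)"
)
conn.execute(
    "INSERT INTO records VALUES (1, 'a', '2011-07-01 00:00:00')"
)
conn.execute(
    "INSERT INTO records VALUES (2, 'b', '2011-07-02 00:00:00')"
)

# On every update, touch the timestamp
# (in MSSQL: SET datelastmodified = getdate())
conn.execute(
    "UPDATE records SET body = 'a2',"
    " datelastmodified = '2011-07-03 00:00:00' WHERE id = 1"
)

# Fetch the latest records in modification order for reindexing
rows = conn.execute(
    "SELECT id FROM records ORDER BY datelastmodified DESC"
).fetchall()
print(rows)  # the just-updated record comes first
```

The indexer then only needs to remember the highest datelastmodified it has already processed.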
2011/7/29 Mohammed Lateef Hussain mohammedlateefh...@gmail.com
Hi
Need some help with a Solr incremental indexing approach.
I have built my Solr index using the SolrJ API and now want to update the
index whenever any change has been made in the
database. My requirement is not to use DB triggers to fire any update
events.
I want to update my index on the fly whenever my application updates any
record in the database.
Note: My indexing logic to get the required data from the DB is somewhat
complex and involves many tables.
Please suggest me how can I proceed here.
Thanks
Lateef
--
*Alexei Martchenko* | *CEO* | Superdownloads
ale...@superdownloads.com.br | ale...@martchenko.com.br | (11)
5083.1018/5080.3535/5080.3533