Thanks guys. Now I can easily search through 10TB of my personal
photos, videos, music and other stuff :)

At some point I had split them into multiple databases and tables
because inserts into a single db/table were taking too long once the
index grew beyond 1 GB. I was storing all the possible metadata about
the media. I used two hex characters for naming the dbs/tables and
ended up with 256 databases, each with 256 tables :D . Don't ask me
why I did it that way; let's just say I was exploring sharding some
years ago, got too excited, and did exactly that :D. Alas, I never
touched it again to finish the search portion until now, when I
really wanted to find a particular photo :)
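
For the curious, the routing boiled down to something like this (a
rough sketch from memory; the MD5 hashing and the media_/files_ names
here are just illustrative):

import java.security.MessageDigest;

public class ShardRouter {
    // Hash the file path; the first two hex chars of the digest pick
    // the database, the next two pick the table: 256 x 256 shards.
    public static String[] route(String filePath) throws Exception {
        byte[] d = MessageDigest.getInstance("MD5")
                .digest(filePath.getBytes("UTF-8"));
        String hex = String.format("%02x%02x", d[0], d[1]);
        return new String[] {
            "media_" + hex.substring(0, 2),   // database, e.g. media_3f
            "files_" + hex.substring(2, 4)    // table, e.g. files_a7
        };
    }
}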

The PK is unique across all the tables, so no issues there. I think I
should be able to run it off a single server at home.
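
For firing the imports shard by shard, along the lines Sandeep and
Shalin suggest below, I'm thinking of a small driver like this (just
a sketch: the db/tbl request params are made up and would have to be
wired into data-config.xml via ${dataimporter.request.*}, and the
"busy" check on the status page is crude):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class ImportAllShards {
    static final String DIH = "http://localhost:8983/solr/dataimport";

    public static void main(String[] args) throws Exception {
        for (int db = 0; db < 256; db++) {
            for (int t = 0; t < 256; t++) {
                // "import" rather than "full-import" so documents from
                // the other shards survive (per Shalin's note below).
                fetch(DIH + String.format("?command=import&db=%02x&tbl=%02x", db, t));
                while (fetch(DIH).contains("busy")) {  // poll until idle
                    Thread.sleep(5000);
                }
            }
        }
    }

    static String fetch(String url) throws Exception {
        BufferedReader in = new BufferedReader(
                new InputStreamReader(new URL(url).openStream()));
        StringBuilder sb = new StringBuilder();
        for (String line; (line = in.readLine()) != null; ) sb.append(line);
        in.close();
        return sb.toString();
    }
}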

Thanks and Best Regards,
Jayant

On Wed, Oct 7, 2009 at 4:52 AM, Shalin Shekhar Mangar
<shalinman...@gmail.com> wrote:
> On Wed, Oct 7, 2009 at 5:09 PM, Sandeep Tagore 
> <sandeep.tag...@gmail.com>wrote:
>
>>
>> You can write an automated program which changes the DB conf details in
>> that XML and fires the full-import command. You can use the
>> http://localhost:8983/solr/dataimport URL to check the status of the
>> data import.
>>
>>
> Also note that full-import deletes all existing documents. So if you write
> such a program that changes the DB conf details, make sure you invoke the
> "import" command (new in Solr 1.4) to avoid deleting the other documents.
>
> --
> Regards,
> Shalin Shekhar Mangar.
>



-- 
www.jkg.in | http://www.jkg.in/contact-me/
Jayant Kr. Gandhi
