Hi, thanks. I can now index data from HBase into the Solr server using the
Nutch core.
But the index data ends up in local storage, and that is what I worry about:
it may grow too large locally.

I have never used MountableHDFS, and I am not sure whether Solr can write its
index into HDFS; I doubt it
can work without implementing Writable for HDFS.
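(Just so it is concrete what the FUSE route would mean: a rough sketch of the mount, going by the MountableHDFS wiki page; the namenode host, port, and mount point below are placeholders, not something I have actually run:)

```
# Mount HDFS through FUSE so it shows up as an ordinary directory
# (host, port, and mount point are placeholder values)
fuse_dfs_wrapper.sh dfs://namenode:9000 /mnt/hdfs

# Solr's dataDir could then point at a directory under the mount,
# e.g. /mnt/hdfs/solr/data, with no Solr code changes
```

Whether Lucene's index I/O patterns (seeks, locking) behave well over such a mount is exactly the open question.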

I think the key point is reading and writing the index files in HDFS just as
in a local filesystem.
Could you make a new index file format that can be used on HDFS? If so, I
think that would
be a great help for distributed indexing.

Since Solr is built on top of Lucene, would it be easy to implement an HDFS
file format?
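(For anyone following along: the setting from the original mail below is a plain local-filesystem path in solrconfig.xml, which is why the index currently has to live on local disk. The path is just the example from my setup:)

```xml
<!-- solrconfig.xml: Solr reads and writes the Lucene index under this
     directory using ordinary local-filesystem I/O -->
<dataDir>/solrhome/data</dataDir>
```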

2009/9/24 Amit Nithian <anith...@gmail.com>

> Would FUSE (http://wiki.apache.org/hadoop/MountableHDFS) be of use?
> I wonder if you could take the data from HBase and index it into a Lucene
> index stored on HDFS.
>
> 2009/9/23 Noble Paul നോബിള്‍ नोब्ळ् <noble.p...@corp.aol.com>
>
> > can HBase be mounted on the filesystem? Solr can only read data from a
> > filesystem
> >
> > On Thu, Sep 24, 2009 at 7:27 AM, 梁景明 <futur...@gmail.com> wrote:
> > > Hi, I use HBase and Solr, and now I have a large amount of data that
> > > needs to be indexed, which means the Solr index will be large,
> > > and as the data increases it will grow even larger.
> > > So, for solrconfig.xml's <dataDir>/solrhome/data</dataDir>: can I set
> > > it from the API and point it to my
> > > distributed HBase data storage?
> > > And if the index is too large, will it be slow?
> > > Thanks.
> > >
> >
> >
> >
> > --
> > -----------------------------------------------------
> > Noble Paul | Principal Engineer| AOL | http://aol.com
> >
>
