Hi Alex,

I can run common queries.
Below is the JSON result for a "*:*" query:

{
  "responseHeader": {
    "status": 0,
    "QTime": 0,
    "params": {
      "indent": "true",
      "q": "*:*",
      "_": "1382938341864",
      "wt": "json"
    }
  },
  "response": {
    "numFound": 0,
    "start": 0,
    "docs": []
  }
}
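
For reference, the result above can also be reproduced from the command line
with curl. This is just a sketch assuming the default core name "collection1"
(I have not verified the actual core name on this install):

curl "http://localhost:8080/solr/collection1/select?q=*:*&wt=json&indent=true"

If the core name is correct, this should return the same kind of empty
response (numFound: 0) as shown above.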




On Mon, Oct 28, 2013 at 9:11 AM, Alexandre Rafalovitch
<arafa...@gmail.com> wrote:

> Can you do queries? Maybe the default collection was somehow not set up and
> you need to provide the collection name explicitly. What endpoints does the
> admin interface use when you do a query?
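>
> For example, with the core name explicit in the path, the query URL would
> look something like this ("collection1" is just a placeholder for whatever
> core actually exists on your install):
>
> http://localhost:8080/solr/collection1/select?q=*:*&wt=json
>
> and the matching update handler Nutch would presumably need to hit would
> then be http://localhost:8080/solr/collection1/update rather than
> /solr/update.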
>
> Regards,
>    Alex.
>
> Personal website: http://www.outerthoughts.com/
> LinkedIn: http://www.linkedin.com/in/alexandrerafalovitch
> - Time is the quality of nature that keeps events from happening all at
> once. Lately, it doesn't seem to be working.  (Anonymous  - via GTD book)
>
>
> On Mon, Oct 28, 2013 at 8:54 AM, Bayu Widyasanyata
> <bwidyasany...@gmail.com> wrote:
>
> > Additional info:
> >
> > - I use Tomcat 7.0.42
> > - Following is the Tomcat/Catalina log from when Nutch failed in the Solr
> > indexing step. It replies with a 404 error:
> >
> > 10.1.160.40 - - [28/Oct/2013:08:50:02 +0700] "POST
> > /solr/update?wt=javabin&version=2 HTTP/1.1" 404 973
> > 10.1.160.40 - - [28/Oct/2013:08:50:02 +0700] "POST
> > /solr/update?wt=javabin&version=2 HTTP/1.1" 404 973
> >
> > Thanks.-
> >
> >
> >
> > On Mon, Oct 28, 2013 at 7:19 AM, Bayu Widyasanyata
> > <bwidyasany...@gmail.com> wrote:
> >
> > > Hi,
> > >
> > > I just installed Nutch 1.7 and the latest Solr 4.5.1 successfully,
> > > but I got an error when executing the crawl script
> > > (./bin/crawl urls/seed.txt TestCrawl http://localhost:8080/solr/ 2).
> > >
> > > The error occurs in the Solr indexer step.
> > > Following is the error from hadoop.log:
> > >
> > > 2013-10-28 06:16:59,815 WARN  mapred.LocalJobRunner - job_local1930559258_0001
> > > org.apache.solr.common.SolrException: Not Found
> > >
> > > Not Found
> > >
> > > request: http://localhost:8080/solr/update?wt=javabin&version=2
> > >         at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:430)
> > >         at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:244)
> > >         at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:105)
> > >         at org.apache.nutch.indexwriter.solr.SolrIndexWriter.close(SolrIndexWriter.java:155)
> > >         at org.apache.nutch.indexer.IndexWriters.close(IndexWriters.java:118)
> > >         at org.apache.nutch.indexer.IndexerOutputFormat$1.close(IndexerOutputFormat.java:44)
> > >         at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.close(ReduceTask.java:467)
> > >         at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:535)
> > >         at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:421)
> > >         at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:398)
> > > 2013-10-28 06:17:00,243 ERROR indexer.IndexingJob - Indexer: java.io.IOException: Job failed!
> > >         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1357)
> > >         at org.apache.nutch.indexer.IndexingJob.index(IndexingJob.java:123)
> > >         at org.apache.nutch.indexer.IndexingJob.run(IndexingJob.java:185)
> > >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > >         at org.apache.nutch.indexer.IndexingJob.main(IndexingJob.java:195)
> > >
> > > I suspect the problem is a broken URI (the "Not Found" message):
> > > http://localhost:8080/solr/update?wt=javabin&version=2
> > >
> > > That URI was also reported as not found when I accessed it directly from
> > > a browser.
> > >
> > > Is there any configuration that I missed?
> > >
> > > Thanks.-
> > >
> > > --
> > > wassalam,
> > > [bayu]
> > >
> >
> >
> >
> > --
> > wassalam,
> > [bayu]
> >
>



-- 
wassalam,
[bayu]
