Re: Marvel not showing nodes stats

2014-08-28 Thread Jeff Byrnes
Boaz,

Thanks for the response; I checked on it late yesterday, and it seems to have 
righted itself somehow. I brought up another cluster this morning, and it 
immediately received all the node and index data properly, so I must have gotten 
something stuck on my first try.

Thanks again!

-- 
Jeff Byrnes
@berkleebassist
Operations Engineer
EverTrue
704.516.4628

On August 28, 2014 at 4:41:13 PM, Boaz Leskes (b.les...@gmail.com) wrote:

Jeff, 

Two things can be at play:

1) Either you have no data (easily checked by a direct call to the monitoring 
cluster, from Sense or curl), or
2) something is wrong with the mapping, typically caused by a missing index 
template. Can you check whether your monitoring cluster responds to GET 
_template/marvel ?

Cheers,
Boaz
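
Boaz's two checks can be run with curl or Sense against the monitoring cluster (GET /.marvel-2014.08.26/_count and GET /_template/marvel). A minimal sketch of how to read the responses; the JSON bodies below are hypothetical samples, not real output from this cluster:

```python
import json

# 1) Data check: GET /.marvel-2014.08.26/_count on the monitoring cluster.
#    A zero count means the production agents never shipped anything.
#    (Hypothetical sample response:)
count_response = json.loads('{"count": 12345, "_shards": {"total": 1, "successful": 1, "failed": 0}}')
print("have data:", count_response["count"] > 0)       # have data: True

# 2) Template check: GET /_template/marvel. An empty object means the
#    marvel index template is missing, so new .marvel-* indices get
#    whatever dynamic mappings Elasticsearch guesses on first write.
template_response = json.loads('{}')
print("template installed:", bool(template_response))  # template installed: False
```

If the first check finds documents but the second comes back empty, the second of Boaz's two causes applies.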

On Tuesday, August 26, 2014 11:23:22 PM UTC+2, Jeff Byrnes wrote:
I'm experiencing a similar issue to this. We have two clusters:

   - a 2-node monitoring cluster (1 master/data node & 1 data-only node)
   - a 5-node production cluster (2 data nodes, 3 master nodes)

The output below is from the non-master data node of the Marvel monitoring 
cluster. There are no errors being reported by any of the production nodes.

[2014-08-26 21:10:51,503][DEBUG][action.search.type       ] [stage-search-marvel-1c] [.marvel-2014.08.26][2], node[iGRH8Gc2QO698RMlWy8rgQ], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@355e93ff]
org.elasticsearch.transport.RemoteTransportException: [stage-search-marvel-1b][inet[/10.99.111.122:9300]][search/phase/query]
Caused by: org.elasticsearch.search.SearchParseException: [.marvel-2014.08.26][2]: query[ConstantScore(BooleanFilter(+*:* +cache(_type:index_stats) +cache(@timestamp:[140908680 TO 140908746])))],from[-1],size[10]: Parse Failure [Failed to parse source [{size:10,query:{filtered:{query:{match_all:{}},filter:{bool:{must:[{match_all:{}},{term:{_type:index_stats}},{range:{@timestamp:{from:now-10m/m,to:now/m}}}],facets:{timestamp:{terms_stats:{key_field:index.raw,value_field:@timestamp,order:term,size:2000}},primaries.docs.count:{terms_stats:{key_field:index.raw,value_field:primaries.docs.count,order:term,size:2000}},primaries.indexing.index_total:{terms_stats:{key_field:index.raw,value_field:primaries.indexing.index_total,order:term,size:2000}},total.search.query_total:{terms_stats:{key_field:index.raw,value_field:total.search.query_total,order:term,size:2000}},total.merges.total_size_in_bytes:{terms_stats:{key_field:index.raw,value_field:total.merges.total_size_in_bytes,order:term,size:2000}},total.fielddata.memory_size_in_bytes:{terms_stats:{key_field:index.raw,value_field:total.fielddata.memory_size_in_bytes,order:term,size:2000]]
    at org.elasticsearch.search.SearchService.parseSource(SearchService.java:664)
    at org.elasticsearch.search.SearchService.createContext(SearchService.java:515)
    at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:487)
    at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:256)
    at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryTransportHandler.messageReceived(SearchServiceTransportAction.java:688)
    at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryTransportHandler.messageReceived(SearchServiceTransportAction.java:677)
    at org.elasticsearch.transport.netty.MessageChannelHandler$RequestHandler.run(MessageChannelHandler.java:275)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.search.facet.FacetPhaseExecutionException: Facet [timestamp]: failed to find mapping for index.raw
    at org.elasticsearch.search.facet.termsstats.TermsStatsFacetParser.parse(TermsStatsFacetParser.java:126)
    at org.elasticsearch.search.facet.FacetParseElement.parse(FacetParseElement.java:93)
    at org.elasticsearch.search.SearchService.parseSource(SearchService.java:648)
    ... 9 more
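
The root cause is the final "Caused by" line: the Marvel dashboards facet on index.raw, a not_analyzed sub-field that the marvel index template is supposed to define, and that field is absent from the day's mapping. A sketch of the difference, using illustrative mappings rather than the exact template Marvel ships:

```python
# Why the facet fails: "index.raw" only exists if the marvel index
# template was applied when the day's .marvel-* index was created.
# The two mappings below are illustrative, not the shipped template.

def has_field(mapping, dotted):
    """Walk a mapping's properties/fields to see if a dotted path exists."""
    props = mapping.get("properties", {})
    parts = dotted.split(".")
    node = props.get(parts[0], {})
    for part in parts[1:]:
        node = node.get("fields", node.get("properties", {})).get(part, {})
    return bool(node)

# With the template: "index" is a multi-field carrying a raw sub-field.
templated = {
    "properties": {
        "index": {
            "type": "string",
            "fields": {"raw": {"type": "string", "index": "not_analyzed"}},
        }
    }
}

# Without the template: dynamic mapping creates a plain analyzed string.
dynamic = {"properties": {"index": {"type": "string"}}}

print(has_field(templated, "index.raw"))  # True  -> facet parses
print(has_field(dynamic, "index.raw"))    # False -> FacetPhaseExecutionException
```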
[2014-08-26 21:10:51,503][DEBUG][action.search.type       ] [stage-search-marvel-1c] [.marvel-2014.08.26][2], node[iGRH8Gc2QO698RMlWy8rgQ], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@32f235e9]
org.elasticsearch.transport.RemoteTransportException: [stage-search-marvel-1b][inet[/10.99.111.122:9300]][search/phase/query]
Caused by: org.elasticsearch.search.SearchParseException: [.marvel-2014.08.26][2]: query[ConstantScore(BooleanFilter(+*:* +cache(_type:node_stats) +cache(@timestamp:[140908680 TO 140908746])))],from[-1],size[10]: Parse Failure [Failed to parse source [{size:10,query:{filtered:{query:{match_all:{}},filter:{bool:{must:[{match_all:{}},{term:{_type:node_stats}},{range:{@timestamp:{from:now-10m/m,to:now/m}}}],facets:{timestamp:{terms_stats:{key_field:node.ip_port.raw,value_field:@timestamp,order:term,size:2000}},master_nodes:{terms:{field:node.ip_port.raw,size:2000},facet_filter:{term:{node.master:true}}},os.cpu.usage:{terms_stats:{key_field:node.ip_port.raw,value_field:os.cpu.usage,order:term,size:2000}},os.load_average.1m:{terms_stats:{key_field:node.ip_port.raw,value_field:os.load_average.1m,order:term,size:2000}},jvm.mem.heap_used_percent:{terms_stats:{key_field:node.ip_port.raw,value_field:jvm.mem.heap_used_percent,order:term,size:2000}},fs.total.available_in_bytes:{terms_stats:{key_field:node.ip_port.raw,value_field:fs.total.available_in_bytes,order:term,size:2000}},fs.total.disk_io_op:{terms_stats:{key_field:node.ip_port.raw,value_field:fs.total.disk_io_op,order:term,size:2000]]
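
Marvel writes to a new .marvel-YYYY.MM.DD index each day, and an index template is applied only when an index is created. That is consistent with how the thread resolved: an index created before the template existed keeps its bad mapping, while the next day's index (or a freshly brought-up cluster, as in Jeff's follow-up) comes out correct. A sketch of that timeline; the template install date here is hypothetical:

```python
# One Marvel index per day; the template only affects indices created
# after it was installed. Dates below are illustrative.
from datetime import date

def marvel_index_for(day):
    """Name of the Marvel index that receives a given day's stats."""
    return f".marvel-{day:%Y.%m.%d}"

template_installed_on = date(2014, 8, 27)  # hypothetical install date

for day in (date(2014, 8, 26), date(2014, 8, 28)):
    ok = day >= template_installed_on  # created after the template existed?
    print(marvel_index_for(day), "ok" if ok else "bad mapping")
# prints:
# .marvel-2014.08.26 bad mapping
# .marvel-2014.08.28 ok
```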
