Hi Andrew,

Each index has 5 shards and each shard has 2 replicas. There are 670 indices now, so there are 670 days' worth of data. But the data is not evenly distributed across indices: a small one may be only 100 KB (old dates), while a big one can be more than 1 GB (recent dates).
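For scale, here is a back-of-envelope count of the total shards this layout implies (using only the numbers above: 670 daily indices, 5 primaries each, 2 replicas per primary):

```python
# Total shard copies in the cluster described above.
indices = 670
primaries_per_index = 5
copies_per_primary = 1 + 2  # the primary itself plus 2 replicas

total_shards = indices * primaries_per_index * copies_per_primary
print(total_shards)  # 10050
```

That is over ten thousand shards for a dataset where many indices hold only ~100 KB, which is a lot of per-shard overhead for the small, old indices.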
curl 'localhost:9200/_cat/indices/?v'

health index      pri rep docs.count docs.deleted store.size pri.store.size
green  2014-03-22   5   1       1205           11     19.7mb          9.8mb
green  2013-11-08   5   2         58            0      4.8mb          1.6mb
green  2014-09-05   5   2     107055            5      1.3gb        473.7mb

We crawl different kinds of social media data, then write each document into the index corresponding to its creation date. Each kind of social media gets its own mapping type, so there are several types per index. We use Kibana to present the data; by default, it searches the latest month of data. And I notice the search queue (queue.size is currently 1000) is not enough.
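A quick sketch of why queue.size = 1000 fills up: each primary shard searched becomes one task in the search thread pool queue. Assuming Kibana's default one-month window touches roughly 30 daily indices (the 30-day figure is an assumption; the 5 primaries per index come from the setup above):

```python
# Fan-out per Kibana dashboard query over the last month of daily indices.
# Assumption: ~30 daily indices in the default one-month window.
days_in_window = 30
primaries_per_index = 5

tasks_per_query = days_in_window * primaries_per_index
print(tasks_per_query)  # 150 shard-level search tasks per query

# With queue.size = 1000, only a handful of concurrent dashboard
# refreshes are needed before searches start getting rejected.
queries_to_fill_queue = 1000 // tasks_per_query
print(queries_to_fill_queue)  # 6
```

You can watch for rejections with the `_cat/thread_pool` API (the search.rejected column) to confirm whether this is what is happening.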