Thanks David.
Yes, I tested it with curl. If the JSON data is not too big, there is no problem. The test JSON format is the following:
{
  name: [user1, user2, user3],
  product: {},
  price: {}
}
The difference between the two JSON data sets is: the last JSON data includes too many
I see. You probably end up having to merge very big mappings!
What is your application searching for? Logs? Users?
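To make the mapping-merge concern concrete: with dynamic mapping, every previously unseen key in an indexed document becomes a new field in the index mapping, and each update must be merged and republished cluster-wide. A minimal sketch of how fields accumulate (the sample documents below are hypothetical, not the poster's actual logs, which carry ~3000 keys each):

```python
# Sketch: how dynamic mapping grows as documents introduce new keys.
# Sample docs are made up for illustration.

def flatten_keys(doc, prefix=""):
    """Yield dotted field paths the way dynamic mapping would name them."""
    for key, value in doc.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            yield from flatten_keys(value, prefix=path + ".")
        else:
            yield path

docs = [
    {"name": ["user1", "user2"], "product": {"id": 1}, "price": {"usd": 9.5}},
    {"name": ["user3"], "product": {"id": 2, "sku": "a-1"}, "ts": 1389173882},
]

mapped_fields = set()
for doc in docs:
    mapped_fields.update(flatten_keys(doc))

print(sorted(mapped_fields))
# Every new key forces a mapping update that must be merged and propagated --
# costly when each log introduces thousands of keys.
```

With thousands of distinct keys per document, the mapping itself becomes large and the merges expensive, which fits the memory growth described in this thread.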
--
David ;-)
Twitter : @dadoonet / @elasticsearchfr / @scrutmydocs
On 9 January 2014 at 10:06, xjj210...@gmail.com wrote:
Thanks David.
Yes, I tested it with curl.
Dear all:
I insert 1 logs to Elasticsearch; each log is about 2 MB and has about 3000 keys and values.
When I insert about 2, it uses about 30 GB of memory, and then Elasticsearch is very slow, and it is hard to insert more logs.
Could someone help me solve this? Thanks very much.
On Wednesday, January 8, 2014 6:58:02 PM UTC+8, xjj2...@gmail.com wrote:
Dear all:
I insert 1 logs to Elasticsearch; each log is about 2 MB and has about 3000 keys and values. When I insert about 2, it uses about 30 GB of memory, and then Elasticsearch is very slow, and
Do you insert that using bulk?
--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet | @elasticsearchfr
On 8 January 2014 at 12:29:33, xjj210...@gmail.com (xjj210...@gmail.com) wrote:
On Wednesday, January 8, 2014 7:30:21 PM UTC+8, David Pilato wrote:
Do you insert that using bulk?
--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet https://twitter.com/dadoonet |
@elasticsearchfr https://twitter.com/elasticsearchfr
On 8 January 2014 at
That was not really my question. Are you using the BULK feature?
--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet | @elasticsearchfr
On 8 January 2014 at 12:38:00, xjj210...@gmail.com (xjj210...@gmail.com) wrote:
The Elasticsearch version I use is 0.90.2.
On Wednesday,
No, I don't use bulk. You mean if I use bulk it may solve the problem? Thanks
On Wednesday, January 8, 2014 7:43:41 PM UTC+8, David Pilato wrote:
That was not really my question. Are you using the BULK feature?
--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet
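For reference, the bulk feature David is asking about sends many documents in one request as newline-delimited JSON: one action line, then one source line per document, and the body must end with a newline. A minimal sketch of building such a payload (the index and type names here are made up for illustration):

```python
import json

def build_bulk_body(docs, index="logs", doc_type="log"):
    """Build a newline-delimited _bulk request body.

    Each document contributes two lines: an action line and a source line.
    The body must end with a newline, or Elasticsearch rejects it.
    """
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index, "_type": doc_type}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"

docs = [{"name": "user1"}, {"name": "user2"}]
body = build_bulk_body(docs)
print(body)
# POST this body to http://localhost:9200/_bulk,
# e.g. curl -XPOST 'http://localhost:9200/_bulk' --data-binary @body.json
```

Batching many documents per request mainly saves per-request overhead; it would not by itself reduce the mapping-related memory growth discussed above.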
The Elasticsearch version I use is 0.90.2.
On Wednesday, January 8, 2014 7:30:21 PM UTC+8, David Pilato wrote:
Do you insert that using bulk?
--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet https://twitter.com/dadoonet |
I only insert the logs into Elasticsearch. I will do the following work:
1: Write the data to Elasticsearch.
2: Then search the data.
Now, when I insert the data into ES, it uses too much memory. I wonder why ES uses so much memory.
Could you give me some suggestions? Thanks
I use jmap.
The environment is the following:
-- Elasticsearch v0.90 (I use 0.90.9; the problem still exists).
-- Java version is 1.7.0_45
Just wondering: are you hitting the same RAM usage when inserting without thrift?
Could you test it?
Could you also gist what this gives:
curl -XGET 'http://localhost:9200/_nodes?all=true&pretty=true'
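The `_nodes` output requested above includes, among much else, each node's JVM heap configuration, which is the first thing to check when 30 GB of memory is in play. A sketch of pulling the heap numbers out of such a response (the payload below is hypothetical and heavily trimmed to the relevant fields):

```python
import json

# Hypothetical, trimmed _nodes response; the real output of
# curl -XGET 'http://localhost:9200/_nodes?all=true&pretty=true' is far larger.
raw = """
{
  "nodes": {
    "abc123": {
      "name": "node-1",
      "jvm": {"mem": {"heap_init_in_bytes": 268435456,
                      "heap_max_in_bytes": 1073741824}}
    }
  }
}
"""

nodes = json.loads(raw)["nodes"]
for node_id, info in nodes.items():
    mem = info["jvm"]["mem"]
    print(info["name"],
          "heap_init=%d MB" % (mem["heap_init_in_bytes"] // 2**20),
          "heap_max=%d MB" % (mem["heap_max_in_bytes"] // 2**20))
# node-1 heap_init=256 MB heap_max=1024 MB
```

If the configured heap max is far below the 30 GB observed, the growth is likely off-heap or process-level (e.g. thrift buffers), which is presumably why David asks to retry without thrift.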
--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet | @elasticsearchfr