I use the update API to update a JSON object, but something is wrong. Please help me.
1. I insert a log into ES:
curl -XPOST 'localhost:9200/12/1/3/_update' -d \
'{"doc":{"price":[{"costprice":1,"icsonprice":1,"name":1123,"pid":1,"pricetype":1,"purchaseprice":1,"shipmentprice":1}],"product":{"status":2330,"type":120}}}'
2. I use the API to
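The update in step 1 can be staged in a file and validated locally before sending it (a minimal sketch; the index/type/id `12/1/3` are taken from the curl command above, and a local node on the default port is assumed):

```shell
# Write the update body to a file so it can be validated before sending.
cat > update_body.json <<'EOF'
{
  "doc": {
    "price": [
      { "costprice": 1, "icsonprice": 1, "name": 1123, "pid": 1,
        "pricetype": 1, "purchaseprice": 1, "shipmentprice": 1 }
    ],
    "product": { "status": 2330, "type": 120 }
  }
}
EOF

# Check that the body parses as JSON before it ever reaches the node.
python3 -m json.tool update_body.json > /dev/null && echo "body is valid JSON"

# Then send it:
# curl -XPOST 'localhost:9200/12/1/3/_update' -d @update_body.json
```

Unquoted keys (`{doc:{price:...}}`) are not valid JSON, which is a common cause of update requests failing.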
Hi all:
If the data format is as follows, can Elasticsearch query it correctly?
The first data looks like this:
A:
{
"attrspec_id":[{"attr_id":55,"attr_value":"barnd","optiona":[{"option_id":5710,"option_value":"Dell"},{"attr_id":39715,"attr_value":"type","option_id":0,"option_value":8135}]}]
}
The second data looks like
Thanks David.
Yes, I tested it with curl. If the JSON data is not too big, there is no problem. The test JSON format is the following:
{
  "name": ["user1", "user2", "user3", …],
  "product": {},
  "price": {}
}
The difference between the two JSON data sets is:
The last JSON data includes too many
Dear all:
I insert 1 logs into Elasticsearch; each log is about 2 MB and has about 3000 keys and values.
When I have inserted about 2, it uses about 30 GB of memory, and then Elasticsearch becomes very slow and it is hard to insert logs.
Could someone help me solve this? Thanks very much.
On Wednesday, January 8, 2014 7:30:21 PM UTC+8, David Pilato wrote:
Do you insert that using bulk?
--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet https://twitter.com/dadoonet |
@elasticsearchfr https://twitter.com/elasticsearchfr
No, I don't use bulk. Do you mean that using bulk might solve the problem? Thanks.
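For context, the bulk API takes newline-delimited JSON: one action line followed by one source line per document, all POSTed to `_bulk` in a single request. A minimal sketch (the index/type names `logs`/`log` and the documents are made up for illustration):

```shell
# Build a small bulk body: action line + source line per document,
# each on its own line (NDJSON), with a trailing newline.
cat > bulk_body.ndjson <<'EOF'
{ "index": { "_index": "logs", "_type": "log", "_id": "1" } }
{ "message": "first log entry" }
{ "index": { "_index": "logs", "_type": "log", "_id": "2" } }
{ "message": "second log entry" }
EOF

echo "actions in batch: $(grep -c '"_index"' bulk_body.ndjson)"

# Send the whole batch in one request. Note --data-binary, which
# preserves the newlines the bulk format depends on:
# curl -XPOST 'localhost:9200/_bulk' --data-binary @bulk_body.ndjson
```

Batching cuts per-request overhead compared to one HTTP call per document; whether it also helps with the heap growth described above is a separate question.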
On Wednesday, January 8, 2014 7:43:41 PM UTC+8, David Pilato wrote:
That was not really my question. Are you using the bulk feature?
The Elasticsearch version I use is 0.90.2.
I only insert the logs into Elasticsearch. I will do the following work:
1: Write the data to Elasticsearch.
2: Then search the data.
Now, when I insert the data into ES, it uses too much memory, and I wonder why.
Could you give me some suggestions? Thanks.
I used jmap.
The environment is the following:
-- Elasticsearch v0.90 (I also tried 0.90.9; the problem still exists).
-- Java version 1.7.0_45
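Since jmap came up: a class histogram is one way to see which object types are holding the heap (a sketch; the pgrep pattern used to locate the node's JVM is an assumption, so adjust it to how your node is started):

```shell
# Locate the Elasticsearch JVM and print the top of a class histogram;
# the first rows show which object types occupy the most heap.
# Assumption: the node was started via the standard bootstrap class.
ES_PID=$(pgrep -f org.elasticsearch.bootstrap.Elasticsearch | head -n1)
if [ -n "$ES_PID" ]; then
  jmap -histo "$ES_PID" | head -n 20
else
  echo "no running Elasticsearch JVM found"
fi
```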