Thank you again Christopher,
An example would be awesome! Although I really only need one line of code
that shows how to pass specific keys to the mapred function to operate on
(in the case where they use a bucket type).
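A minimal Erlang sketch of that one line with the PB client, for what it's worth (the type, bucket, and key names are made up, and the node address/port are the local defaults; my understanding is that with bucket types each input's bucket position becomes the tuple {Type, Bucket} instead of a bare binary):

```erlang
%% Sketch only, untested against a live node; names are placeholders.
{ok, Pid} = riakc_pb_socket:start_link("127.0.0.1", 8087),
%% With a typed bucket, each input is {{Type, Bucket}, Key}.
Inputs = [{{<<"mytype">>, <<"mybucket">>}, <<"key1">>},
          {{<<"mytype">>, <<"mybucket">>}, <<"key2">>}],
%% Built-in map phase that just returns each object's value.
Query = [{map, {modfun, riak_kv_mapreduce, map_object_value}, none, true}],
{ok, Results} = riakc_pb_socket:mapred(Pid, Inputs, Query).
```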
As to my map function: it really does nothing; it simply returns key-value
pairs.
It is expected that the total amount of data per node drops quite a lot,
correct? I'm doubling the size of the cluster (adding 6 more nodes).
I ask this because the current 6 machines have 1.5 TB of disk, but the new
ones (for now) have only 1 TB.
Best regards
—
Sent from my iPhone
On Sat,
Yes, clearly I have a problem with writing a file.
You're right, I will contact Heroku for that.
May I ask what steps you follow to build this release on Heroku?
Regards
Corentin
2015-02-05 18:31 GMT+01:00 Christopher Meiklejohn cmeiklej...@basho.com:
On Feb 4, 2015, at 10:59 PM, YouBarco wrote:
Hello,
My OS is Ubuntu 14.04 64-bit, and I installed Erlang from source with
version R16B as follows:
That's your problem. You MUST use the custom Basho branch of Erlang/OTP
with Riak. If you insist on building Erlang/Riak from source then follow
this guide for Erlang:
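For reference, the build typically goes through kerl roughly like this (a sketch based on Basho's Erlang installation guide; the exact basho tag varies by Riak release, so treat the tag name as an assumption and check the guide):

```
# Build and activate Basho's patched OTP with kerl
# (the OTP_R16B02_basho* tag name is an assumption -- check the guide)
kerl build git https://github.com/basho/otp.git OTP_R16B02_basho8 R16B02-basho8
kerl install R16B02-basho8 ~/erlang/R16B02-basho8
. ~/erlang/R16B02-basho8/activate
```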
Nirav, are you using CRDTs or plain old types with Riak? The definition of
field names makes a big difference in what gets archived, and Solr will not
complain if it can't find matching fields; it just won't index them. You can
take a peek at the data dir on the Riak instance to see what, if
Hi Chris,
Thank you for the prompt reply.
That is exactly what I do, though. I've noticed that a bucket can now be
either a binary or a tuple {binary, binary}, where the first element is the
bucket type and the second is the bucket. This works for put/get operations
and for mapred_bucket, which traverses the
By the look of it, returnTerm is available in 1.3+ and regex matching got
merged in 2.0?
Also, is there any documentation on what subset of Perl regular expressions
is supported?
Thanks
Daniel
Nirav,
Could you possibly detail the steps you used to upload the schema, adjust
the bucket properties, etc.? That would help us identify the issue.
Luc
On Thu, Feb 5, 2015 at 9:42 AM, Nirav Shah niravis...@yahoo.com wrote:
Hi Shawn,
Thanks for the response. To give you some background
1.
On Feb 5, 2015, at 10:55 AM, Mikhail Pustovalov mpustova...@gmail.com wrote:
Hello,
I am using MapReduce just as a way to get multiple keys in one query (I
couldn't find a better way). My code used to work with Riak v.1.4 but now
when I try to run it against the latest version (2.0.4)
Hi All,
Just wanted to check what kind of configuration settings everyone uses in a
production clustered environment for Riak Search/AAE, and whether someone can
share some experience with it? We currently have 2 GB of memory allocated to
Solr and are currently just using the default parameters from
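For comparison, the stock riak.conf settings in question look roughly like this (a 2.0-era sketch; the JVM values are my recollection of the shipped defaults, so verify against your own riak.conf):

```
## riak.conf fragment -- 2.0-era defaults (values are assumptions)
search = on
search.solr.jvm_options = -d64 -Xms1g -Xmx1g -XX:+UseStringCache -XX:+UseCompressedOops
anti_entropy = active
```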
Hi Shawn,
I am using plain old types. Some of the fields we index end with Id, like
execId, verId, and orderId, and are defined as long. There are some that have
random strings, which are defined as yz_str. Do you think these fields can
cause issues?
Surprisingly, I am seeing some data and some are
The only thing I can think of is that your flattened full name is not being
matched. Also, looking at Basho’s default schema, it should be “_yz_str” and
not “yz_str”, and that only works if you actually have it defined as:
<types>
<!-- YZ String: Used for non-analyzed fields -->
On Feb 5, 2015, at 12:47 PM, Mikhail Pustovalov mpustova...@gmail.com wrote:
Thank you, Christopher, for your fast answer. However, I did not mention
that I was using the cedar-14 stack, with an ephemeral file system: the file
system allows write operations, but written files are not preserved when a
dyno restarts [1]. Moreover, I can write files, because the make command did not
Hi Nirav,
About your last point: just yesterday I started playing with Search 2.0 (Solr)
and Riak. Basho did a good job of integrating the Solr platform, but the docs
are sometimes misleading. One thing I found out was that, using the default
schema provided by Basho, if you are using CRDTs, your
On Feb 4, 2015, at 10:59 PM, Corentin Jechoux
corentin.jechoux@gmail.com wrote:
Hi Shawn,
Thanks for the response. To give you some background:
1. We are using a custom schema with the default bucket type
2. I have search set to on :)
3. I have associated BucketProperties/Index to my buckets.
4. What I am seeing is, I am getting data back, but for some reason I am not getting the
I don’t think the problem is the file limit. The problem is that Riak cannot
start due to an Erlang problem (something like that). I had the same error before.
What did I do?
I deleted everything related to Riak and Erlang,
installed OTP_R16B02 and otp_r16b02,
then installed Riak 2.0 +
And it
Hello,
My OS is Ubuntu 14.04 64-bit, and I installed Erlang from source with version
R16B as follows:
---
ubuntu@riak1:~/riak-2.0.4/dev$ erl
Erlang R16B (erts-5.10.1)
Hello,
I am using MapReduce just as a way to get multiple keys in one query (I
couldn't find a better way). My code used to work with Riak v.1.4 but now
when I try to run it against the latest version (2.0.4) mapred queries
return {error, notfound} for each key supplied.
I have created a bucket
I would probably take a look at the ulimit docs:
http://docs.basho.com/riak/latest/ops/tuning/open-files-limit/ . This becomes
more acute the more nodes you run on the same OS.
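The usual fix from that page looks roughly like this (a sketch: the limit values and the riak user name are assumptions, adjust per the docs and your init system):

```
# /etc/security/limits.conf -- raise open-file limits for the riak user
# (65536 is a commonly suggested value, not a prescription)
riak soft nofile 65536
riak hard nofile 65536
```

You can check the effective limit for the Riak user with `ulimit -n` in a fresh login shell.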
-Alexander
@siculars
http://siculars.posthaven.com
Sent from my iRotaryPhone
On Feb 5, 2015, at 01:35, YouBarco
Hi Luc,
Thanks for the response.
Here are the steps I performed on my application's startup. I am using the
default bucket type in my application:
1. Create Custom Schema
String schema = Source.fromInputStream(inputStream,
"UTF-8").mkString(); (inputStream comes from a
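For reference, with the 2.x Riak Java client the "create custom schema" step looks roughly like this (a sketch from memory of the 2.0 client API; the schema name and XML body are placeholders, and it assumes a node on localhost):

```java
import com.basho.riak.client.api.RiakClient;
import com.basho.riak.client.api.commands.search.StoreSchema;
import com.basho.riak.client.core.query.search.YokozunaSchema;

public class UploadSchema {
    public static void main(String[] args) throws Exception {
        // Placeholder schema body; normally read from a resource file.
        String schemaXml = "<schema name=\"my_schema\" version=\"1.5\">...</schema>";
        RiakClient client = RiakClient.newClient("127.0.0.1");
        // Upload the schema so it can be referenced when creating an index.
        YokozunaSchema schema = new YokozunaSchema("my_schema", schemaXml);
        client.execute(new StoreSchema.Builder(schema).build());
        client.shutdown();
    }
}
```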