Kyle -
Verify return code: 19 (self signed certificate in certificate chain)
Since your server cert is self-signed, there isn't much more that can be
done at this point, I believe. My security tests use a dedicated CA
whose root cert is available for validation.
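Not part of the original thread, but the point about a dedicated CA can be reproduced locally: a self-signed certificate fails verification until it is itself supplied as the trusted CA, which is the same mechanism behind s_client's `-CAfile` option. A minimal sketch (the CN is a placeholder):

```shell
# Generate a throwaway self-signed certificate.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout key.pem -out cert.pem -days 1 -subj "/CN=riak-test"

# Fails: the cert is self-signed and not in any trust store.
openssl verify cert.pem

# Succeeds once the cert is explicitly trusted as a CA.
openssl verify -CAfile cert.pem cert.pem   # prints: cert.pem: OK
```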
Hi Luke,
I am getting the following information:
osboxes@osboxes:/etc/riak$ openssl s_client -debug -connect 10.0.2.15:8088
CONNECTED(00000003)
write to 0x24244c0 [0x2424a60] (295 bytes => 295 (0x127))
0000 - 16 03 01 01 22 01 00 01-1e 03 03 d8 cb 68 b8 45   ...."........h.E
0010 - 9a c3 54 21
Travis,
I cannot address the crash error you see in the logs. Someone else will have
to review that problem.
I want to point you to this wiki article:
https://github.com/basho/leveldb/wiki/riak-tuning-2
The article details how you can potentially increase Riak's throughput by
restricting
Magnus
Thanks for your reply. We are using the riack C client library for Riak
(https://github.com/trifork/riack), which is used within an application called
MapCache to store 256x256 px images with a corresponding key within Riak.
Currently we have 75 million images to transfer from disk
I did try it with the command below, followed by the error. I am not sure how
to specify that it should go to our S3 cluster, but I tried it after the @ sign
and then the bucket. It goes to Amazon, which seems to be the default. I know
it isn't an issue with Riak and that technically it
Hey Eric,
Very cool. Thank you!
From: riak-users on behalf of Eric Johnson
Date: Monday, August 29, 2016 at 8:34 AM
To: "riak-users@lists.basho.com"
Subject: Re: [ANN] Riak TS v1.4 is released
Hey Chris,
Riak CS provides an S3 capable API, so theoretically it could work.
Have you tried? If so and you're having issues, follow up here.
--
Luke Bakken
Engineer
lbak...@basho.com
On Wed, Aug 31, 2016 at 7:38 AM, Valenti, Anthony
wrote:
> Has anyone setup Hadoop to be able
Has anyone set up Hadoop to use Riak CS as an S3 source/destination
instead of, or in addition to, Amazon S3? Hadoop assumes that it should go to
Amazon S3 by default. Specifically, I am trying to use Hadoop distcp to copy
files to Riak CS.
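The thread doesn't give a working invocation, but a hedged sketch of pointing Hadoop's S3 connector at a non-Amazon endpoint follows; the hostname, port, credentials, bucket, and paths are all placeholders, and the property names assume the s3a connector (availability of individual properties varies by Hadoop version):

```
# Hypothetical Riak CS endpoint; path-style access avoids
# bucket-as-subdomain DNS lookups that would resolve to Amazon.
hadoop distcp \
  -Dfs.s3a.endpoint=riak-cs.example.com:8080 \
  -Dfs.s3a.access.key=ACCESS_KEY \
  -Dfs.s3a.secret.key=SECRET_KEY \
  -Dfs.s3a.path.style.access=true \
  -Dfs.s3a.connection.ssl.enabled=false \
  hdfs:///data/input s3a://my-bucket/data
```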
Thanks,
Anthony
On 26 August 2016 at 22:20, Travis Kirstine <
tkirst...@firstbasesolutions.com> wrote:
> Is there any way to speed up bulk loading? I'm wondering if I should be
> tweaking the Erlang, AAE, or other config options?
>
Hi Travis,
Excuse the late reply; your message had been stuck in the
Is there any way to speed up bulk loading? I'm wondering if I should be tweaking
the Erlang, AAE, or other config options?
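Since the question explicitly mentions AAE, one commonly tuned knob during bulk loads is active anti-entropy. A minimal sketch, assuming a Riak 2.x riak.conf (re-enable after the load finishes so replicas are repaired again):

```
# riak.conf — pause AAE tree rebuilds and exchanges during the bulk load
anti_entropy = passive
```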
_______________________________________________
riak-users mailing list
riak-users@lists.basho.com
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com