So the bug is this: a map reduce job whose "inputs" include a nonexistent
bucket/key causes an error.  Correct?

I assumed that we would get a "not found" response back, same as with the
rest interface, and thus assumed my encoding was what was causing the
problem.  My mistake.

Upon further reflection, I realized that base64 encoding wouldn't work
unless I stored the values that way, since it reads the utf8 as strings
correctly.  My plan is to store my values directly as byte values.  This is
easy with the protobuf bytes type.  However, how would I then encode them
in the "inputs" section of the map reduce json block?  I've noticed that
there is a secondary erlang format that can be passed for map reduce jobs;
must I use that?  If so, does anyone have an example of generating one of
these from within Java?
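To illustrate why my first attempt went wrong: base64 only changes how the bytes are written into the JSON text; nothing tells Riak to decode them again, so the encoded string itself becomes the key it looks up. A minimal round-trip sketch (plain Java 8 Base64, no Riak client involved; the class and method names are just for illustration):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class KeyEncoding {
    // Encode raw key bytes so they can travel inside a JSON "inputs" array.
    static String encodeForJson(byte[] raw) {
        return Base64.getEncoder().encodeToString(raw);
    }

    // Decode back to the raw bytes the server would need to look up.
    static byte[] decodeFromJson(String encoded) {
        return Base64.getDecoder().decode(encoded);
    }

    public static void main(String[] args) {
        byte[] key = "testKey".getBytes(StandardCharsets.UTF_8);
        System.out.println(encodeForJson(key)); // dGVzdEtleQ==
    }
}
```

Note that encoding "testKey" yields exactly the <<"dGVzdEtleQ==">> in the earlier error: the server was looking up the base64 text as a literal key.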

Two other questions:
1. I noticed some earlier discussions about the protobuf client being alpha
state (I think that was from a number of months ago).  Is it generally safe
to use these days?
2. Are the request and response threads in Riak separate or sequential?  For
example, if I send 5 normal PbcGetReq requests in quick succession on a
single socket, does Riak finish the first one before starting on requests
2-5?  Or does it thread the requests out as they come in, so it will
handle 2-5 simultaneously?  I'm asking because I'm trying to figure out
how much I should try to reuse a single socket connection.
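Whatever the server does internally, one conservative way to reuse connections from Java is to let only one thread use a socket at a time and pool a handful of them, so request/response pairs can never interleave on a single connection. A hypothetical sketch (generic over the connection type; not taken from any real Riak client):

```java
import java.util.Collection;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Function;

// Hypothetical pool: a connection is borrowed by exactly one thread at a
// time, so requests on a given socket are strictly sequential client-side.
public class ConnectionPool<C> {
    private final BlockingQueue<C> idle;

    public ConnectionPool(Collection<C> connections) {
        this.idle = new ArrayBlockingQueue<>(connections.size(), true, connections);
    }

    // Borrow a connection, run one request against it, always give it back.
    public <R> R withConnection(Function<C, R> request) {
        C conn;
        try {
            conn = idle.take();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IllegalStateException("interrupted waiting for a connection", e);
        }
        try {
            return request.apply(conn);
        } finally {
            idle.offer(conn); // capacity == pool size, so this always succeeds
        }
    }
}
```

With, say, five pooled connections you still get up to five in-flight requests, without ever pipelining on one socket.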

Thanks for your help,
Jacques

On Thu, Jun 2, 2011 at 1:00 AM, Russell Brown <[email protected]> wrote:

> Hi Jacques,
>
> Where to start...hm.
>
> On 2 Jun 2011, at 02:47, Jacques wrote:
>
> I'm using Java and looking to replicate multi-get using a map-only job per
> everyone's recommendation.
>
> My bucket and key names are binary values, not strings.
>
> I attempted to run a map reduce job using a json input object, base 64
> encoding the values.  This failed.
>
> What is the correct way to submit a pbc map reduce job where the inputs
> info is binary?
>
> Thanks,
> Jacques
>
>
>
> *Error when trying base 64 values:*
> ** Reason for termination ==
> **
> {json_encode,{bad_term,{not_found,{<<"dGVzdDI=">>,<<"dGVzdEtleQ==">>}, 
> undefined}}}
>
>
> This is a bug in the pb interface of Riak that I have patched and it will
> be in master soon. The error occurs because the {not_found} term you see in
> the log cannot be serialised to JSON. The fix is here:
> https://github.com/basho/riak_kv/pull/103
>
> However, this error just means that your bucket/key combo
> <<"dGVzdDI=">>,<<"dGVzdEtleQ==">> is not found, which is another problem
> altogether.
>
> Are your bucket and key names actual binary values, or base64 encoded
> binary values?
>
> Cheers
>
> Russell
>
>
>
> *JSON:*
> {
>     "inputs": [
>         [
>             "dGVzdDI=",
>             "dGVzdEtleQ=="
>         ],
>         [
>             "dGVzdDI=",
>             "dGVzdEtleQ=="
>         ]
>     ],
>     "query": [{"map": {
>         "keep": true,
>         "language": "javascript",
>         "source": "function(v) { return [v]; }"
>     }}]
> }
>
>
> _______________________________________________
> riak-users mailing list
> [email protected]
> http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
>
>
>