-sparkr-dev@googlegroups +dev@spark.apache.org

[Please send SparkR development questions to the Spark user / dev
mailing lists. Replies inline]

> From:  <pvillaco...@stratio.com>
> Date: Tue, Jul 5, 2016 at 3:30 AM
> Subject: Call to new JObject sometimes returns an empty R environment
> To: SparkR Developers <sparkr-...@googlegroups.com>
>
>
>
>  Hi all,
>
>  I have recently moved from SparkR 1.5.2 to 1.6.0. I am doing some
> experiments with SparkR:::newJObject("java.util.HashMap") and I have
> noticed that the behaviour has changed: it now returns an "environment"
> instead of a "jobj":
>
>> print(class(SparkR:::newJObject("java.util.HashMap")))  # SparkR 1.5.2
> [1] "jobj"
>
>> print(class(SparkR:::newJObject("java.util.HashMap")))  # SparkR 1.6.0
> [1] "environment"
>
> Moreover, the environment returned is apparently empty (when I call
> ls() on the resulting environment, it returns character(0)). This
> problem only happens with some Java classes, and I am not able to say
> exactly which classes cause it.

The reason this is different in Spark 1.6 is that we added support for
automatically deserializing Maps returned from the JVM as environments
on the R side. The pull request
https://github.com/apache/spark/pull/8711 has some more details. The
reason BitSet / ArrayList "work" is that we don't do any special
serialization / de-serialization for them.
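
As an illustration, here is a minimal sketch of the difference, assuming
a running SparkR 1.6 backend (e.g. after sc <- sparkR.init()) and keeping
in mind that newJObject / callJMethod are private functions:

  m <- SparkR:::newJObject("java.util.HashMap")
  class(m)    # "environment": the (empty) Map is deserialized on the R side
  ls(m)       # character(0), since the map has no entries

  b <- SparkR:::newJObject("java.util.BitSet")
  class(b)    # "jobj": no special deserialization, so you keep a JVM handle
  SparkR:::callJMethod(b, "toString")   # "{}", invoked on the JVM-side object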

>
> If I try to create an instance of other classes such as
> java.util.BitSet, it works successfully. I thought it might be related
> to parameterized types, but it also works successfully with ArrayList
> and with HashSet, which take type parameters.
>
> Any suggestions on this change of behaviour (apart from "do not use
> private functions" :-))?

Unfortunately there isn't much more to say than that. The
serialization/de-serialization is an internal API and we don't claim
to maintain backwards compatibility. You might be able to work around
this particular issue by wrapping your Map in a different object.
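
As a sketch of that workaround, assuming you control a JVM-side wrapper
class on your classpath (the com.example.MapHolder class and its put/get
methods below are hypothetical):

  holder <- SparkR:::newJObject("com.example.MapHolder")  # hypothetical class holding a HashMap internally
  class(holder)                                           # "jobj": not a java.util.Map, so it is not converted
  SparkR:::callJMethod(holder, "put", "key", "value")     # hypothetical method delegating to the inner map
  SparkR:::callJMethod(holder, "get", "key")              # "value"

Because the wrapper itself is not a Map, the backend hands back a jobj
reference and the map stays on the JVM side.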

Thanks
Shivaram

>
> Thank you very much
>