I guess it has to do with Tungsten's explicit memory management, which
builds on sun.misc.Unsafe. The "ConvertToUnsafe" class converts
Java-object-based rows into UnsafeRows, which use Spark's internal,
memory-efficient binary format.
Here is the related code in 1.6:
ConvertToUnsafe is defined in:
http
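As a rough sketch of what that operator does per row (not the actual
implementation), the snippet below re-encodes an object-based InternalRow
into an UnsafeRow using Catalyst's UnsafeProjection; these are
Spark-internal APIs, and the schema and values are made up for
illustration:

// Hedged sketch only: ConvertToUnsafe essentially applies an
// UnsafeProjection to each object-based row, copying its fields into
// Tungsten's compact binary UnsafeRow layout.
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.catalyst.expressions.{UnsafeProjection, UnsafeRow}
import org.apache.spark.sql.types._

val schema = StructType(Seq(
  StructField("prod_id", LongType),
  StructField("cust_id", LongType)))

// Object-based row: fields live as boxed objects on the JVM heap
val objectRow: InternalRow = InternalRow(13L, 42L)

// Re-encode the row into the Tungsten binary format
val toUnsafe = UnsafeProjection.create(schema)
val unsafeRow: UnsafeRow = toUnsafe(objectRow)

println(unsafeRow.getLong(0))      // 13
println(unsafeRow.getSizeInBytes)  // size of the packed binary row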
Hi,
In Spark's Physical Plan, what is the explanation for ConvertToUnsafe?
Example:
scala> sorted.filter($"prod_id" ===13).explain
== Physical Plan ==
Filter (prod_id#10L = 13)
+- Sort [prod_id#10L ASC,cust_id#11L ASC,time_id#12 ASC,channel_id#13L ASC,promo_id#14L ASC], true, 0
   +- ConvertToUnsafe