Hi,

I'm wondering how far off base I am with the question:

Is a LogicalPlan in #SparkSQL similar to an RDD in #ApacheSpark Core, in
that both seem to be a description (metadata) of a computation that
eventually gets executed to produce records?
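To make concrete what I mean by "a description of a computation", here is a
toy sketch (my own names, nothing from Spark's actual classes): building the
description is cheap, and records only materialize when it is executed.

```scala
// Toy model: an immutable tree describing a computation, executed lazily.
// This is how I picture both an RDD lineage and a LogicalPlan, conceptually.
sealed trait Description[A] {
  def execute(): Seq[A]
}
// A leaf node: records come straight from a source.
final case class Source[A](records: Seq[A]) extends Description[A] {
  def execute(): Seq[A] = records
}
// An operator node: transforms the child's records.
final case class MapNode[A, B](child: Description[A], f: A => B)
    extends Description[B] {
  def execute(): Seq[B] = child.execute().map(f)
}

object Demo extends App {
  // Composing nodes builds only metadata ...
  val plan = MapNode(Source(Seq(1, 2, 3)), (n: Int) => n * 2)
  // ... records appear only on execution.
  println(plan.execute()) // List(2, 4, 6)
}
```

(Obviously Spark's real RDDs and LogicalPlans carry partitioning, analysis,
optimization etc., but that's the shared shape I have in mind.)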

What am I missing, if anything? How imprecise am I being by comparing
a LogicalPlan to an RDD?

I'm considering a Dataset to be a pair of a LogicalPlan and an Encoder,
where the Encoder handles the data in a more sophisticated, low-level
way, while the LogicalPlan describes how the data is computed/retrieved.
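As a toy sketch of that mental model (hypothetical names, not Spark's real
API): the plan computes raw rows, and the encoder turns each row into a
typed object.

```scala
// Hypothetical names for illustration only -- not Spark's actual classes.
// An encoder converts an untyped row into a typed value T.
trait ToyEncoder[T] {
  def fromRow(row: Seq[Any]): T
}
// A logical plan is just "how to compute the rows".
final case class ToyLogicalPlan(compute: () => Seq[Seq[Any]])
// A Dataset pairs the two: collect() executes the plan, then decodes.
final case class ToyDataset[T](plan: ToyLogicalPlan, encoder: ToyEncoder[T]) {
  def collect(): Seq[T] = plan.compute().map(encoder.fromRow)
}

object PairDemo extends App {
  val plan = ToyLogicalPlan(() => Seq(Seq("a", 1), Seq("b", 2)))
  val enc = new ToyEncoder[(String, Int)] {
    def fromRow(row: Seq[Any]) =
      (row(0).asInstanceOf[String], row(1).asInstanceOf[Int])
  }
  println(ToyDataset(plan, enc).collect()) // List((a,1), (b,2))
}
```

Is that roughly the division of labour between the two halves of a Dataset?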

Please help me understand the concepts in a more accurate way. Thanks!

Best regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
