Hello

Is it possible to find out a DataFrame's total storage size in bytes? Something like:

>>> df.size()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/spark/python/pyspark/sql/dataframe.py", line 1660, in __getattr__
    "'%s' object has no attribute '%s'" % (self.__class__.__name__, name))
AttributeError: 'DataFrame' object has no attribute 'size'

Of course that doesn't work, but if such a method existed it would be great.

Thanks.
