Hi Jorn,
Just wanted to check if you got a chance to look at this problem. I couldn't
figure out why this is happening. Any help would be
appreciated.
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
I have already sent it at least 10 times! I sent another one today as well!
On Tue, Oct 30, 2018 at 3:51 PM Biplob Biswas
wrote:
> You need to send the email to user-unsubscr...@spark.apache.org and not
> to the usergroup.
>
> Thanks & Regards
> Biplob Biswas
>
>
> On Tue, Oct 30, 2018 at 10:59 AM Anu B N
You need to send the email to user-unsubscr...@spark.apache.org and not to
the usergroup.
Thanks & Regards
Biplob Biswas
On Tue, Oct 30, 2018 at 10:59 AM Anu B Nair wrote:
> I have been sending this unsubscribe mail for the last few months! It never
> happens! If anyone can help us to unsubscribe it wil
I have been sending this unsubscribe mail for the last few months! It never
happens! If anyone can help us unsubscribe, it will be really helpful!
On Tue, Oct 30, 2018 at 3:27 PM Mohan Palavancha
wrote:
I would like to know if it's possible to invoke Python Spark code from Java.
I have a Java-based framework where
a SparkSession is created and some DataFrames are passed as arguments to
an API.

Transformation.java

interface Transformation {
    // Takes a set of input datasets plus the session and returns the result.
    Dataset<Row> transform(Set<Dataset<Row>> inputDatasets, SparkSession session);
}
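In case it helps, one common pattern is to run the Python side as a separate
spark-submit job launched from the Java framework, since a SparkSession's JVM
objects cannot be handed directly to an external Python process; the
DataFrames would have to be written out (for example as Parquet) and read back
by the Python script. Here is a minimal sketch using only the JDK's
ProcessBuilder — the script name and paths are hypothetical placeholders:

```java
import java.util.ArrayList;
import java.util.List;

public class PySparkInvoker {

    // Build the spark-submit command line for a hypothetical Python
    // transformation script. The Java side would first write its
    // DataFrames to inputPath (e.g. as Parquet) for the script to read.
    static List<String> buildCommand(String script, String inputPath) {
        List<String> cmd = new ArrayList<>();
        cmd.add("spark-submit");
        cmd.add("--master");
        cmd.add("local[*]");
        cmd.add(script);
        cmd.add(inputPath);
        return cmd;
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = buildCommand("transform.py", "/tmp/input.parquet");
        System.out.println(String.join(" ", cmd));

        // To actually launch it (requires spark-submit on the PATH):
        // Process p = new ProcessBuilder(cmd).inheritIO().start();
        // int exitCode = p.waitFor();
    }
}
```

Spark also ships a programmatic equivalent,
org.apache.spark.launcher.SparkLauncher, which wraps the same spark-submit
invocation if you prefer not to build the command line by hand.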
Older versions of Spark did indeed have lower performance for Python and R,
because of the need to convert between JVM datatypes and Python/R datatypes.
This changed around Spark 2.3 with the integration of Apache Arrow. However,
what you do after the conversion in those languages can still be slower.
Super,
Now it makes sense. I am copying Holden on this email.
Regards,
Gourav
On Tue, 30 Oct 2018, 06:34 lchorbadjiev,
wrote:
> Hi Gourav,
>
> the question, in fact, is: are there any limitations in Apache Spark's
> support for the Parquet file format?
>
> The example schema from the dremel paper
Are you using the same parquet version as Spark uses? Are you using a recent
version of Spark? Why don’t you create the file in Spark?
> On 30.10.2018 at 07:34, lchorbadjiev wrote:
>
> Hi Gourav,
>
> the question, in fact, is: are there any limitations in Apache Spark's
> support for Parquet f