We're actually running on on-prem Kubernetes with a custom-built Spark
image, with an altered entrypoint.sh and other "low-level" scripts and configs,
but I don't think this is a good direction for solving this specific issue.
Shay
From: Artemis User
Sent: Thurs
You can try SequenceFileRDDFunctions.saveAsSequenceFile or RDD.saveAsObjectFile,
which serialize the data to (NullWritable, BytesWritable) pairs.
On 14 Nov 2022 at 21:07, Shrikant Prasad wrote:
I have tried with that also. It gives the same exception:
ClassNotFoundException: sequencefile.DefaultSource
Regards,
Shrikant
On Mon, 14 Nov 2022 at 6:35 PM, Jie Han wrote:
It seems that the name is “sequencefile”.
On 14 Nov 2022 at 20:59, Shrikant Prasad wrote:
Hi,
I have an application which writes a dataframe into sequence file using
df.write.format("sequence").insertInto("hivetable1")
This was working fine with Spark 2.7.
Now I am trying to migrate to Spark 3.2 and am getting a ClassNotFoundException:
sequence.DefaultSource error.
Is there a