Tobias,
Understood, and thanks for the quick resolution of the problem.
Thanks
~Rahul
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/serialization-issue-in-case-of-case-class-is-more-than-1-tp20334p20446.html
Sent from the Apache Spark User List mailing list
On Fri, Dec 5, 2014 at 7:12 AM, Tobias Pfeiffer t...@preferred.jp wrote:
Rahul,
On Fri, Dec 5, 2014 at 2:50 PM, Rahul Bindlish
rahul.bindl...@nectechnologies.in wrote:
I have done so; that's why Spark is able to load the objectfile [e.g.
person_obj]
and Spark has maintained the serialVersionUID
It's an easy mistake to make... I wonder if an assertion could be
implemented that makes sure the type parameter is present.
We could use the NotNothing pattern
http://blog.evilmonkeylabs.com/2012/05/31/Forcing_Compiler_Nothing_checks/
but I wonder if it would just make the method signature
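The idea above can be made concrete. Below is a minimal, Spark-free sketch of the NotNothing pattern from the linked post; `objectFileChecked` and `Person` are hypothetical names for illustration, not Spark's actual API:

```scala
import scala.reflect.ClassTag

// Evidence that a type parameter was given explicitly: for any real T the
// single NotNothing[Any] instance applies (via contravariance), but for
// T = Nothing the two extra instances make implicit search ambiguous,
// so the call fails to compile.
sealed trait NotNothing[-T]
object NotNothing {
  implicit object notNothing extends NotNothing[Any]
  implicit object ambiguous1 extends NotNothing[Nothing]
  implicit object ambiguous2 extends NotNothing[Nothing]
}

// Hypothetical wrapper around sc.objectFile; it only reports the requested
// type so the sketch stays self-contained and Spark-free.
def objectFileChecked[T](path: String)(implicit ev: NotNothing[T], ct: ClassTag[T]): String =
  s"loading $path as ${ct.runtimeClass.getSimpleName}"

case class Person(name: String, age: Int)

val ok = objectFileChecked[Person]("person_obj") // compiles
// objectFileChecked("person_obj")  // does not compile: T is inferred as Nothing
```

The ambiguous-implicit trick turns a forgotten type parameter into a compile-time error rather than a runtime ClassCastException.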
Is it a limitation that Spark does not support more than one case class at a
time?
Regards,
Rahul
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/serialization-issue-in-case-of-case-class-is-more-than-1-tp20334p20415.html
Sent from the Apache Spark User List mailing list
On Fri, Dec 5, 2014 at 12:53 PM, Rahul Bindlish
rahul.bindl...@nectechnologies.in wrote:
Is it a limitation that Spark does not support more than one case class at a
time?
What do you mean? I do not have the slightest idea what you *could*
possibly mean by Spark "supporting" a case class.
Tobias
Hi Tobias,
Thanks for your response.
I have created objectfiles [person_obj, office_obj] from
csv files [person_csv, office_csv] using case classes [person, office] with the API
(saveAsObjectFile).
Now I restarted spark-shell and loaded the objectfiles using the API (objectFile).
*Once any of one
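For context on the failure mode being discussed: saveAsObjectFile/objectFile serialize the RDD's elements (plain Java serialization with Spark's default serializer), so the case class definition must be visible again when reading. A minimal Spark-free sketch of that mechanism — the helper names and file name here are illustrative, not Spark's API:

```scala
import java.io._

case class Person(name: String, age: Int)

// Write each element with Java serialization, roughly what the default
// serializer behind saveAsObjectFile does per element.
def saveObjects[T](path: String, xs: Seq[T]): Unit = {
  val out = new ObjectOutputStream(new FileOutputStream(path))
  try xs.foreach(x => out.writeObject(x)) finally out.close()
}

// Read n elements back; this throws if the class definition is missing
// or its serialVersionUID no longer matches what was written.
def loadObjects[T](path: String, n: Int): Seq[T] = {
  val in = new ObjectInputStream(new FileInputStream(path))
  try (1 to n).map(_ => in.readObject().asInstanceOf[T]) finally in.close()
}

val people: Seq[Person] = {
  saveObjects("person_obj.bin", Seq(Person("a", 1), Person("b", 2)))
  loadObjects[Person]("person_obj.bin", 2)
}
```

This is why the case classes have to be defined again after restarting spark-shell before objectFile can deserialize anything.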
Rahul,
On Fri, Dec 5, 2014 at 1:29 PM, Rahul Bindlish
rahul.bindl...@nectechnologies.in wrote:
I have created objectfiles [person_obj, office_obj] from
csv files [person_csv, office_csv] using case classes [person, office] with the API
(saveAsObjectFile).
Now I restarted spark-shell and load
Tobias,
Thanks for the quick reply.
Definitely, after a restart the case classes need to be defined again.
I have done so; that's why Spark is able to load the objectfile [e.g. person_obj]
and Spark has maintained the serialVersionUID [person_obj].
Next time, when I am trying to load another objectfile [e.g.
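One way to keep the serialVersionUID stable across redefinitions is Scala's standard @SerialVersionUID annotation: with a pinned UID, a structurally identical case class defined again after a shell restart stays compatible with already-written data, whereas a JVM-computed UID can change and trigger InvalidClassException. A minimal in-memory sketch (the class and value are illustrative):

```scala
import java.io._

// Pin the UID instead of letting the JVM derive one from the class shape.
@SerialVersionUID(1L)
case class Office(city: String)

val office: Office = {
  val buf = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(buf)
  out.writeObject(Office("Tokyo"))
  out.close()
  val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
  in.readObject().asInstanceOf[Office]
}
```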
Rahul,
On Fri, Dec 5, 2014 at 2:50 PM, Rahul Bindlish
rahul.bindl...@nectechnologies.in wrote:
I have done so; that's why Spark is able to load the objectfile [e.g. person_obj]
and Spark has maintained the serialVersionUID [person_obj].
Next time, when I am trying to load another objectfile [e.g.
Tobias,
Find the csv and scala files attached; below are the steps:
1. Copy the csv files into the current directory.
2. Open spark-shell from this directory.
3. Run the one_scala file, which will create object-files from the csv-files in the
current directory.
4. Restart spark-shell.
5. a. Run the two_scala file; while running it is
Rahul,
On Fri, Dec 5, 2014 at 3:51 PM, Rahul Bindlish
rahul.bindl...@nectechnologies.in wrote:
1. Copy the csv files into the current directory.
2. Open spark-shell from this directory.
3. Run the one_scala file, which will create object-files from the csv-files in the
current directory.
4. Restart spark-shell