Sure. I ran the same job with fewer columns; here is the exception:
java.lang.IllegalArgumentException: requirement failed: DataFrame must
have the same schema as the relation to which is inserted.
DataFrame schema: StructType(StructField(pixel0,ByteType,true),
StructField(pixel1,ByteType,true),
Hi,
thanks for the answers. If joining the DataFrames is the solution, then why
does the simple withColumn() succeed for some datasets and fail for others?
2016-02-11 11:53 GMT+01:00 Michał Zieliński:
> I think a good idea would be to do a join:
>
> outputDF =
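
The quoted suggestion above is cut off, so the exact join Michał had in mind is unknown. The usual idea behind this fix is to give both DataFrames a shared row key and join on that key, instead of passing a Column that belongs to a different DataFrame into withColumn(). A minimal plain-Java sketch of that keyed-join idea (the names and the integer key scheme here are illustrative only, not Spark API):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class KeyedJoinSketch {
    // Attach a row index to each value, mimicking what an index column
    // (e.g. one added per row) would provide on a DataFrame.
    static Map<Integer, String> withRowKey(List<String> rows) {
        Map<Integer, String> keyed = new LinkedHashMap<>();
        for (int i = 0; i < rows.size(); i++) {
            keyed.put(i, rows.get(i));
        }
        return keyed;
    }

    // Join the two keyed "tables" on the shared row key; only keys
    // present on both sides produce an output row (an inner join).
    static List<String[]> joinOnKey(Map<Integer, String> left,
                                    Map<Integer, String> right) {
        List<String[]> joined = new ArrayList<>();
        for (Map.Entry<Integer, String> e : left.entrySet()) {
            String match = right.get(e.getKey());
            if (match != null) {
                joined.add(new String[] { e.getValue(), match });
            }
        }
        return joined;
    }

    public static void main(String[] args) {
        List<String> unlabelled = List.of("row-a", "row-b");
        List<String> predicted = List.of("label-0", "label-1");
        List<String[]> out =
                joinOnKey(withRowKey(unlabelled), withRowKey(predicted));
        for (String[] row : out) {
            System.out.println(row[0] + "," + row[1]);
        }
    }
}
```

In Spark terms this would correspond to adding an index column to both unlabelledDF and predictedDF and then joining them on it; since the original reply is truncated, the exact column names and join keys are assumptions.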
Seems like a bug.
Suggest filing an issue with a code snippet if this can be reproduced on the
1.6 branch.
Cheers
On Fri, Feb 12, 2016 at 4:25 AM, Zsolt Tóth wrote:
> Sure. I ran the same job with fewer columns, the exception:
>
> java.lang.IllegalArgumentException:
Hi,
I'd like to append a column of one DataFrame to another DataFrame (using
Spark 1.5.2):
DataFrame outputDF = unlabelledDF.withColumn("predicted_label",
predictedDF.col("predicted"));
I get the following exception:
java.lang.IllegalArgumentException: requirement failed: DataFrame must have
the same