Hello!
I'm working with some Parquet files stored on Amazon S3 and loading them into a DataFrame with

Dataset<Row> df = spark.read().parquet(parquetFileLocation);

however, after some time I get a "Timeout waiting for connection from pool" exception. I hope I'm not mistaken, but I think that there's
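For reference, when Spark reads Parquet from S3 through the s3a:// connector, this exception usually means the S3A HTTP connection pool has been exhausted (for example by streams that are never closed, or simply by a pool that is too small for the level of parallelism). One common mitigation is raising the pool size. A minimal sketch for spark-defaults.conf, assuming the s3a filesystem is in use; the value 200 is only illustrative, not a tuned recommendation:

```properties
# spark-defaults.conf (sketch): enlarge the S3A HTTP connection pool.
# fs.s3a.connection.maximum is the Hadoop S3A property; 200 is an
# illustrative value, not tuned for any particular workload.
spark.hadoop.fs.s3a.connection.maximum  200
```

The same property can also be set programmatically with SparkSession.builder().config(...) before the session is created.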
> .getOrCreate();
>
> List<Row> results = new LinkedList<>();
> JavaRDD<Row> jsonRDD =
>     new JavaSparkContext(sparkSession.sparkContext()).parallelize(results);
>
> Dataset<Row> peopleDF = sparkSession.createDataFrame(jsonRDD, Row.class);
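A note on the quoted snippet: the createDataFrame overload that takes a Class expects a JavaBean, so passing Row.class generally does not work. With a JavaRDD of Row, the overload that takes an explicit StructType schema is the usual fit. A minimal sketch, assuming spark-sql is on the classpath and a single illustrative string column named "name" (the loop body is a stand-in for the real calculation):

```java
// Sketch only — requires the Spark SQL dependency; "name" is illustrative.
List<Row> results = new LinkedList<>();
for (int i = 0; i < 3; i++) {               // stand-in for the real calculation
    results.add(RowFactory.create("result-" + i));
}

// Row objects carry no type information, so supply the schema explicitly
// instead of Row.class (the Class overload expects a JavaBean).
StructType schema = DataTypes.createStructType(
        Collections.singletonList(
                DataTypes.createStructField("name", DataTypes.StringType, false)));

JavaRDD<Row> jsonRDD =
        new JavaSparkContext(sparkSession.sparkContext()).parallelize(results);
Dataset<Row> peopleDF = sparkSession.createDataFrame(jsonRDD, schema);
```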
Hello!
I am running Spark on Java and have bumped into a problem I can't solve; I couldn't find anything helpful among the answered questions, so I would really appreciate your help.
I am running some calculations, creating rows for each result:

List<Row> results = new LinkedList<>();
for (something) {