We finally managed to find the problem: the S3 files were located in
Frankfurt, which only supports the *v4* signature.
*Surprising* is the fact that the Spark core library method textFile does
not support that!
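For readers hitting the same wall: the commonly cited workaround (an assumption here, not something confirmed later in this thread) is to enable the AWS SDK's SigV4 signer via a JVM system property and point the s3a filesystem at the regional endpoint. The jar name is a placeholder.

```shell
# Hypothetical sketch: enable SigV4 signing for an s3a bucket in
# Frankfurt (eu-central-1). Property and endpoint names are the standard
# AWS SDK / hadoop-aws settings, not taken from this thread.
spark-submit \
  --conf spark.driver.extraJavaOptions=-Dcom.amazonaws.services.s3.enableV4=true \
  --conf spark.executor.extraJavaOptions=-Dcom.amazonaws.services.s3.enableV4=true \
  --conf spark.hadoop.fs.s3a.endpoint=s3.eu-central-1.amazonaws.com \
  my-job.jar
```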
What does the user have to do here? I am using a key and secret!
How can I simply create an RDD from a text file on S3?
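The usual pattern can be sketched as follows; this is a minimal sketch assuming hadoop-aws is on the classpath, and the bucket, paths, and credentials are placeholders, not details from this thread:

```scala
// Minimal sketch: read a text file from S3 into an RDD via the s3a
// filesystem. Bucket name and key values are placeholders.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("s3-read-example")
val sc = new SparkContext(conf)

// Credentials can also come from environment variables or instance roles
// instead of being set programmatically.
sc.hadoopConfiguration.set("fs.s3a.access.key", "YOUR_ACCESS_KEY")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "YOUR_SECRET_KEY")

val rdd = sc.textFile("s3a://my-bucket/path/to/file.txt")
println(rdd.count())
```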
Thanks
Didi
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/s3-bucket-access-read-file-tp23536.html
Sent from the Apache Spark User List mailing list archive
bsd
I am new to Spark Streaming and have some issues for which I can't find
any documentation.
I believe many Spark users in general, and Spark Streaming users in
particular, use it to analyze events by computing large distributed
aggregations.
In case i have to digest
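The kind of distributed aggregation described above can be sketched with a windowed reduce; this is a hedged illustration only, and the socket source, host/port, and key extraction are all placeholder assumptions:

```scala
// Sketch: count events per key over a sliding window in Spark Streaming.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("streaming-aggregation")
val ssc = new StreamingContext(conf, Seconds(10))
ssc.checkpoint("/tmp/checkpoint") // required for windowed state

val events = ssc.socketTextStream("localhost", 9999)

// Count events per key over a 60-second window, sliding every 10 seconds.
val counts = events
  .map(line => (line.split(",")(0), 1L))
  .reduceByKeyAndWindow(_ + _, Seconds(60), Seconds(10))

counts.print()
ssc.start()
ssc.awaitTermination()
```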
Hi all
I guess the problem has something to do with the fact that I submit the
job to a remote location.
I submit from an OracleVM running Ubuntu and suspect some NAT issues,
maybe?
akka.tcp tries this address, as shown in the appended STDERR output:
akka.tcp://spark@LinuxDevVM.local:59266
STDERR
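That akka.tcp://spark@LinuxDevVM.local:59266 address suggests the driver is advertising the VM's local hostname, which the cluster cannot resolve back through NAT. A common workaround (an assumption on my part, not something confirmed in this thread) is to make the driver advertise an address the cluster can actually reach; the IP and jar name below are placeholders:

```shell
# Hypothetical sketch: advertise a reachable driver address instead of
# the VM's local hostname.
export SPARK_LOCAL_IP=192.168.1.42
spark-submit \
  --conf spark.driver.host=192.168.1.42 \
  my-job.jar
```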