Kevin:
Can you describe how you got past the Metadata fetch exception?
> On Apr 16, 2016, at 9:41 AM, Kevin Eid wrote:
>
> One last email to announce that I've fixed all of the issues. Don't hesitate
> to contact me if you encounter the same ones. I'd be happy to help.
>
> Regards,
> Kevin
>
On 14 Apr 2016 12:39 p.m., "Kevin Eid" wrote:
> Hi all,
>
> I managed to copy my .py files from local to the cluster using SCP. And I
> mana
Update:
- I managed to log in to the cluster.
- I want to use copy-dir to deploy my Python files to all nodes, and I read I
need to copy them to /ephemeral/hdfs. But I don't know how to move them from
my local machine onto the cluster and into HDFS (one way is sketched below).
Thanks in advance,
Kevin
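For anyone stuck on the same step, a rough sketch of one way through it,
assuming the default spark-ec2 layout (copy-dir under /root/spark-ec2,
ephemeral HDFS under /root/ephemeral-hdfs; file names and the master hostname
are placeholders):
# On your local machine: copy the .py files to the master over SSH.
scp -i key.pem -r ./pyfiles root@<master-public-dns>:/root/
# On the master: replicate the directory to all slave nodes.
/root/spark-ec2/copy-dir /root/pyfiles
# Or load a file into the ephemeral HDFS instead.
/root/ephemeral-hdfs/bin/hadoop fs -put /root/pyfiles/my_job.py /user/root/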
Hi,
Thanks for your emails. I tried running your command, but it returned: "No
such file or directory".
So I definitely need to move my local .py files to the cluster. I tried
logging in (before sshing) but could not find the master:
./spark-ec2 -k key -i key.pem login weather-cluster
- and then sshing in.
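One likely cause, in case it helps: spark-ec2 defaults to us-east-1, so if the
cluster was launched in another region, the login action can't find the master
unless it is given the same region (region and hostname below are placeholders):
# Pass the region the cluster was actually launched in:
./spark-ec2 -k key -i key.pem --region=eu-west-1 login weather-cluster
# get-master prints the master's hostname, handy as an scp target:
./spark-ec2 -k key -i key.pem --region=eu-west-1 get-master weather-cluster
scp -i key.pem *.py root@<master-public-dns>:/root/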
Which py file is your main file (the primary py file)? Zip the other two py
files and leave the main py file alone. Don't copy them to S3, because it seems
that only local primary and additional py files are supported:
./bin/spark-submit --master spark://... --py-files
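Spelled out, the call might look like this (a sketch; main.py is a placeholder
for the primary script and helper1.py/helper2.py for the other two files):
# Zip the two helper modules; the entry-point script stays outside the zip.
zip deps.zip helper1.py helper2.py
./bin/spark-submit --master spark://<master-public-dns>:7077 --py-files deps.zip main.py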
I don't know how to do it with Python, but Scala has a plugin named sbt-pack
that creates a self-contained unix command from your code, with no need to use
spark-submit. Something similar to that tool should exist out there.
Alonso Isidoro Roman.