tar: Error is not recoverable: exiting now
mv: missing destination file operand after `spark'
Try `mv --help' for more information.
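These two errors usually mean the download was truncated (so tar aborts) and that mv was given only a source operand. A minimal sketch of the intended extract-and-rename step; the version string below is a placeholder, not taken from the thread:

```shell
# Stand-in archive so the commands are runnable anywhere
# (in practice you would re-download the real Spark tarball first).
mkdir -p spark-1.4.0-bin-hadoop2.6
tar -czf spark-1.4.0-bin-hadoop2.6.tgz spark-1.4.0-bin-hadoop2.6

tar -xzf spark-1.4.0-bin-hadoop2.6.tgz   # aborts early if the file is truncated
mv spark-1.4.0-bin-hadoop2.6 spark       # mv needs BOTH a source and a destination
```

If tar still exits with "Error is not recoverable", re-download the archive rather than retrying the extraction.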
--
Augustus Hong
Software Engineer
L$2.apply(Utility.scala:256)
> at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
> at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
> at scala.xml.Utility$.sequenceToXML(Utility.scala:256)
> at scala.xml.Utility$.serialize(Utility.scala:227)
> at s
>> cluster in place, you'll probably have to do
>> that manually. Otherwise, perhaps spark-ec2 is not the right tool, and
>> instead you want one of those "grown-up" management tools like Ansible
>> which can be setup to allow in-place upgrades. That'll take a bit of work,
to spin up a new cluster every time we need to upgrade.
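That replace-rather-than-upgrade workflow can be sketched with spark-ec2 itself; the cluster names, key pair, and identity file below are placeholders, and the commands require AWS credentials, so this is illustrative only:

```shell
# Bring up a fresh cluster on the new Spark version (names are placeholders)
./ec2/spark-ec2 --key-pair=my-key --identity-file=my-key.pem \
    --slaves=4 launch my-new-cluster
# ...point jobs at the new master, then tear down the old cluster:
./ec2/spark-ec2 --key-pair=my-key --identity-file=my-key.pem \
    destroy my-old-cluster
```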
Thanks,
Augustus
--
Augustus Hong
Data Analytics | Branch Metrics
m 650-391-3369 | e augus...@branch.io
ocs/latest/submitting-applications.html
>>
>> # Run on a Spark standalone cluster in client deploy mode
>> ./bin/spark-submit \
>> --class org.apache.spark.examples.SparkPi \
>> --master spark://207.184.161.138:7077 \
>> --executor-memory 20G \
>> *-
only one job will be run, even
though there are a lot of idle cores.
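One common cause in standalone mode: when no per-application core cap is set, the first submitted application claims every core, so later applications queue with zero executors even though cores look idle. A minimal sketch of capping cores per app (the value 8 is illustrative, not from the thread):

```
# conf/spark-defaults.conf -- illustrative values
spark.cores.max        8
spark.executor.memory  20g
```

The same cap can be passed per submission with --total-executor-cores on spark-submit.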
Best,
Augustus