Without knowing what the code that handles those arguments looks like, I
would put the following in the "Arguments" field when submitting the Dataproc job:
--trainFile=gs://Anahita/small_train.dat
--testFile=gs://Anahita/small_test.dat
--numFeatures=9947
--numRounds=100
... provided you still keep those files in
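For reference, driver code that accepts such arguments typically splits each `--key=value` pair in its `main()`. The sketch below is only an illustration (the option names `trainFile` and `numFeatures` are taken from the example arguments above, not from your actual code):

```scala
// Hypothetical sketch of parsing "--key=value" program arguments
// as passed in the Dataproc "Arguments" field.
object ArgParse {
  // Collect every "--key=value" argument into a Map(key -> value).
  def parseArgs(args: Array[String]): Map[String, String] =
    args.collect {
      case arg if arg.startsWith("--") && arg.contains("=") =>
        val Array(key, value) = arg.stripPrefix("--").split("=", 2)
        key -> value
    }.toMap

  def main(args: Array[String]): Unit = {
    val opts = parseArgs(args)
    // Look up the values the job needs; names here are assumptions.
    val trainFile   = opts("trainFile")
    val numFeatures = opts("numFeatures").toInt
    println(s"trainFile=$trainFile numFeatures=$numFeatures")
  }
}
```

If your code instead reads the values positionally (e.g. `args(0)`, `args(1)`), you would pass just the bare values in the same order, without the `--key=` prefixes.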
Dear friends,
I am trying to run Spark code on Google Cloud by submitting a job, following
https://cloud.google.com/dataproc/docs/tutorials/spark-scala
My question is about the "Arguments" part.
In my Spark code, there are some variables whose values are defined in
a shell file (.sh), as