Well, that is what the OP stated:

"I have a Spark cluster consisting of 4 nodes in standalone mode ..."

HTH
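For context, which mode Spark runs in is decided by the --master URL given to spark-submit (or by spark.master in conf/spark-defaults.conf). A sketch, reusing the class name, jar, and arguments visible in the OP's sun.java.command further down the thread:

```shell
# Standalone cluster: point --master at the standalone master's URL
spark-submit \
  --master spark://master.clust:7077 \
  --deploy-mode client \
  --class DemoApp \
  SparkPOC.jar 10 4.3

# On YARN instead (assumes HADOOP_CONF_DIR points at the cluster config):
#   spark-submit --master yarn --deploy-mode client --class DemoApp SparkPOC.jar 10 4.3

# With no --master on the command line, spark-submit falls back to
# spark.master from conf/spark-defaults.conf; if that is also unset,
# the application runs locally as local[*].
```

The spark.master value shown in the Environment tab (spark://master.clust:7077 here) is the authoritative check of which mode actually took effect.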

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 5 July 2016 at 19:24, Michael Segel <msegel_had...@hotmail.com> wrote:

> Did the OP say he was running a standalone cluster of Spark, or on YARN?
>
>
> On Jul 5, 2016, at 10:22 AM, Mich Talebzadeh <mich.talebza...@gmail.com>
> wrote:
>
> Hi Jakub,
>
> Any reason why you are running in standalone mode, given that you are
> familiar with YARN?
>
> In theory your settings are correct. I checked your Environment tab
> settings and they look fine.
>
> I assume you have checked this link
>
> http://spark.apache.org/docs/latest/spark-standalone.html
>
> BTW, is this issue confined to ML, or do other Spark applications exhibit
> the same behaviour in standalone mode?
>
>
> HTH
>
>
>
>
> On 5 July 2016 at 11:17, Jacek Laskowski <ja...@japila.pl> wrote:
>
>> Hi Jakub,
>>
>> You're correct: spark.master = spark://master.clust:7077 proves your
>> point. You're running Spark Standalone, perhaps set in
>> conf/spark-defaults.conf.
>>
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>> On Tue, Jul 5, 2016 at 12:04 PM, Jakub Stransky <stransky...@gmail.com>
>> wrote:
>>
>>> Hello,
>>>
>>> I am convinced that we are not running in local mode:
>>>
>>> Runtime Information
>>>
>>> Name    Value
>>> Java Home    /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre
>>> Java Version    1.7.0_65 (Oracle Corporation)
>>> Scala Version    version 2.10.5
>>> Spark Properties
>>>
>>> Name    Value
>>> spark.app.id    app-20160704121044-0003
>>> spark.app.name    DemoApp
>>> spark.driver.extraClassPath    /home/sparkuser/sqljdbc4.jar
>>> spark.driver.host    10.2.0.4
>>> spark.driver.memory    4g
>>> spark.driver.port    59493
>>> spark.executor.extraClassPath    /usr/local/spark-1.6.1/sqljdbc4.jar
>>> spark.executor.id    driver
>>> spark.executor.memory    12g
>>> spark.externalBlockStore.folderName
>>>  spark-5630dd34-4267-462e-882e-b382832bb500
>>> spark.jars    file:/home/sparkuser/SparkPOC.jar
>>> spark.master    spark://master.clust:7077
>>> spark.scheduler.mode    FIFO
>>> spark.submit.deployMode    client
>>> System Properties
>>>
>>> Name    Value
>>> SPARK_SUBMIT    true
>>> awt.toolkit    sun.awt.X11.XToolkit
>>> file.encoding    UTF-8
>>> file.encoding.pkg    sun.io
>>> file.separator    /
>>> java.awt.graphicsenv    sun.awt.X11GraphicsEnvironment
>>> java.awt.printerjob    sun.print.PSPrinterJob
>>> java.class.version    51.0
>>> java.endorsed.dirs
>>>  /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/lib/endorsed
>>> java.ext.dirs
>>>  
>>> /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/lib/ext:/usr/java/packages/lib/ext
>>> java.home    /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre
>>> java.io.tmpdir    /tmp
>>> java.library.path
>>>  /usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
>>> java.runtime.name    OpenJDK Runtime Environment
>>> java.runtime.version    1.7.0_65-mockbuild_2014_07_16_06_06-b00
>>> java.specification.name    Java Platform API Specification
>>> java.specification.vendor    Oracle Corporation
>>> java.specification.version    1.7
>>> java.vendor    Oracle Corporation
>>> java.vendor.url    http://java.oracle.com/
>>> java.vendor.url.bug    http://bugreport.sun.com/bugreport/
>>> java.version    1.7.0_65
>>> java.vm.info    mixed mode
>>> java.vm.name    OpenJDK 64-Bit Server VM
>>> java.vm.specification.name    Java Virtual Machine Specification
>>> java.vm.specification.vendor    Oracle Corporation
>>> java.vm.specification.version    1.7
>>> java.vm.vendor    Oracle Corporation
>>> java.vm.version    24.65-b04
>>> line.separator
>>> os.arch    amd64
>>> os.name    Linux
>>> os.version    2.6.32-431.29.2.el6.x86_64
>>> path.separator    :
>>> sun.arch.data.model    64
>>> sun.boot.class.path
>>>  
>>> /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/lib/resources.jar:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/lib/rt.jar:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/lib/jsse.jar:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/lib/jce.jar:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/lib/charsets.jar:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/lib/rhino.jar:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/lib/jfr.jar:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/classes
>>> sun.boot.library.path
>>>  /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.65.x86_64/jre/lib/amd64
>>> sun.cpu.endian    little
>>> sun.cpu.isalist
>>> sun.io.unicode.encoding    UnicodeLittle
>>> sun.java.command    org.apache.spark.deploy.SparkSubmit --conf
>>> spark.driver.extraClassPath=/home/sparkuser/sqljdbc4.jar --class  --class
>>> DemoApp SparkPOC.jar 10 4.3
>>> sun.java.launcher    SUN_STANDARD
>>> sun.jnu.encoding    UTF-8
>>> sun.management.compiler    HotSpot 64-Bit Tiered Compilers
>>> sun.nio.ch.bugLevel
>>> sun.os.patch.level    unknown
>>> user.country    US
>>> user.dir    /home/sparkuser
>>> user.home    /home/sparkuser
>>> user.language    en
>>> user.name    sparkuser
>>> user.timezone    Etc/UTC
>>> Classpath Entries
>>>
>>> Resource    Source
>>> /home/sparkuser/sqljdbc4.jar    System Classpath
>>> /usr/local/spark-1.6.1/assembly/target/scala-2.10/spark-assembly-1.6.1-hadoop2.2.0.jar
>>>    System Classpath
>>> /usr/local/spark-1.6.1/conf/    System Classpath
>>> /usr/local/spark-1.6.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar
>>>  System Classpath
>>> /usr/local/spark-1.6.1/lib_managed/jars/datanucleus-core-3.2.10.jar
>>>  System Classpath
>>> /usr/local/spark-1.6.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar
>>>  System Classpath
>>> http://10.2.0.4:35639/jars/SparkPOC.jar    Added By User
>>>
>>> On 4 July 2016 at 21:43, Mich Talebzadeh <mich.talebza...@gmail.com>
>>> wrote:
>>>
>>>> Well, this will be apparent from the Environment tab of the GUI; it
>>>> shows how the job is actually running.
>>>>
>>>> Jacek's point is correct. I suspect this is actually running in local
>>>> mode, as it appears to be consuming everything from the master node.
>>>>
>>>> HTH
>>>>
>>>>
>>>>
>>>>
>>>> On 4 July 2016 at 20:35, Jacek Laskowski <ja...@japila.pl> wrote:
>>>>
>>>>> On Mon, Jul 4, 2016 at 8:36 PM, Mathieu Longtin <
>>>>> math...@closetwork.org> wrote:
>>>>>
>>>>>> Are you using a --master argument, or equivalent config, when calling
>>>>>> spark-submit?
>>>>>>
>>>>>> If not, it runs in standalone mode.
>>>>>>
>>>>>
>>>>> s/standalone/local[*]
>>>>>
>>>>> Jacek
>>>>>
>>>>
>>>>
>>>
>>>
>>> --
>>> Jakub Stransky
>>> cz.linkedin.com/in/jakubstransky
>>>
>>>
>>
>
>
