Re: Hostname :BUG

2020-03-12 Thread Zahid Rahman
Hey Dodgy Bob, Linux and C programmers, conscientious non-objectors,

I have a great idea I want to share with you.
In Linux I am familiar with wc (wc = word count; Linux users don't like
long-winded typing).
The wc flags are:
   -c, --bytes   print the byte counts
   -m, --chars   print the character counts
   -l, --lines   print the newline counts


zahid@192:~/Downloads> wc -w /etc/hostname
55 /etc/hostname

The first program I was tasked to write in C was to replicate the Linux
wc utility.
I called it wordcount.c, invoked like wordcount -c -l -m or wordcount -c -l /etc.
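For anyone curious what such a replica boils down to, here is a minimal sketch of a wc-like counter. It is written in plain Java rather than C (the class name `WordCount` and the exact output format are illustrative, not from the original wordcount.c):

```java
import java.nio.file.Files;
import java.nio.file.Path;

// A minimal wc-like counter: reports newline, word, and byte counts,
// roughly mirroring wc -l, -w, and -c.
public class WordCount {
    public static long[] count(String text) {
        long lines = text.chars().filter(c -> c == '\n').count();
        long words = 0;
        String trimmed = text.trim();
        if (!trimmed.isEmpty()) {
            // split on runs of whitespace, as wc does for word counting
            words = trimmed.split("\\s+").length;
        }
        long bytes = text.getBytes().length;
        return new long[] {lines, words, bytes};
    }

    public static void main(String[] args) throws Exception {
        String text = new String(Files.readAllBytes(Path.of(args[0])));
        long[] c = count(text);
        System.out.println(c[0] + " " + c[1] + " " + c[2] + " " + args[0]);
    }
}
```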

Anyway, on this page https://spark.apache.org/examples.html
there are examples of word count in Scala, Python, and Java.
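For readers who haven't seen those examples, the core of the Spark word count is just a split-then-group-then-count pipeline. The same logic can be sketched in plain java.util.stream, with no cluster involved (the class name `WordCountStream` is illustrative, not from the Spark examples page):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

// The split -> group -> count pipeline behind the Spark word-count example,
// expressed with plain Java streams instead of RDDs.
public class WordCountStream {
    public static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                // split each line into words, like flatMap in the Spark version
                .flatMap(line -> Arrays.stream(line.split("\\s+")))
                .filter(w -> !w.isEmpty())
                // group identical words and count them, like reduceByKey
                .collect(Collectors.groupingBy(Function.identity(),
                        Collectors.counting()));
    }
}
```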

I kinda feel left out because I know a little C and a little Linux.
I think adding a C example would be a great idea for the sake of
"*familiarity* *for the client*" (the application developer).
I was thinking of raising a JIRA, but I thought I would consult with fellow
developers first. :)

Please be kind.

Backbutton.co.uk
¯\_(ツ)_/¯
♡۶Java♡۶RMI ♡۶
Make Use Method {MUM}
makeuse.org



On Mon, 9 Mar 2020 at 08:57, Zahid Rahman  wrote:

> [...]

Re: Hostname :BUG

2020-03-09 Thread Zahid Rahman
Hey Floyd,

I just realised something:
you need to practice using the adduser command to create users,
or in your case useradd, because that's less painful for you than creating a
user while working as root.
Trust me, it is good for you.
Then you will realise that this bit of code, new SparkConf(), is reading the
IP address from /etc/hostname and not from the /etc/hosts file.

Backbutton.co.uk
¯\_(ツ)_/¯
♡۶Java♡۶RMI ♡۶
Make Use Method {MUM}
makeuse.org



On Wed, 4 Mar 2020 at 21:14, Andrew Melo  wrote:

> [...]

Re: Hostname :BUG

2020-03-05 Thread Zahid Rahman
Talking about copy and paste:
Larry Tesler, the *inventor* of *cut*/*copy* & *paste* and find & replace,
passed away last week at age 74.

Backbutton.co.uk
¯\_(ツ)_/¯
♡۶Java♡۶RMI ♡۶
Make Use Method {MUM}
makeuse.org



On Thu, 5 Mar 2020 at 07:01, Zahid Rahman  wrote:

> [...]

Re: Hostname :BUG

2020-03-04 Thread Zahid Rahman
Please explain why you think that, if your reason is different from this one:

If you think so because the header of my /etc/hostname says "hosts", that is
only because I copied the file header from /etc/hosts into /etc/hostname.




On Wed, 4 Mar 2020, 21:14 Andrew Melo,  wrote:

> [...]

Re: Hostname :BUG

2020-03-04 Thread Andrew Melo
Hello Zahid,

On Wed, Mar 4, 2020 at 1:47 PM Zahid Rahman  wrote:

> Hi,
>
> I found the problem: on my Linux operating system, /etc/hostname was blank.
>
> *STEP 1*
> I searched Google for the error message, and an answer suggested
> I should add the following to /etc/hostname:
>
> 127.0.0.1  [hostname] localhost.
>

I believe you've confused /etc/hostname and /etc/hosts --
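For contrast, on a typically configured Linux machine the two files look roughly like this (the name "myhost" is just an illustrative placeholder):

```
# /etc/hostname -- contains only the machine's name, one line, no comments:
myhost

# /etc/hosts -- maps IP addresses to names:
127.0.0.1   localhost
127.0.1.1   myhost
```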


> [...]

Hostname :BUG

2020-03-04 Thread Zahid Rahman
Hi,

I found the problem: on my Linux operating system, /etc/hostname was blank.

*STEP 1*
I searched Google for the error message, and an answer suggested
I should add the following to /etc/hostname:

127.0.0.1  [hostname] localhost.

I did that, but there was still an error; this time the Spark log on
standard output was concatenating the text content
of /etc/hostname like so: 127.0.0.1[hostname]localhost.

*STEP 2*
My second attempt was to change /etc/hostname to 127.0.0.1.
This time I got a warning mentioning "using loop back"
rather than an error.

*STEP 3*
I wasn't happy with that, so I then changed /etc/hostname to the contents
shown below, and the warning message disappeared. My guess is that the
error occurs while creating the Spark session,
in the SparkConf() API.

 SparkConf sparkConf = new SparkConf()
         .setAppName("Simple Application")
         .setMaster("local")
         .set("spark.executor.memory", "2g");
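An alternative worth noting: rather than editing /etc/hostname, Spark's driver binding can usually be pinned down through its own configuration. A sketch using the standard Spark properties spark.driver.bindAddress and spark.driver.host (the 127.0.0.1 values here are illustrative; they could also be passed via .set(...) on the SparkConf above):

```
# spark-defaults.conf
spark.driver.bindAddress   127.0.0.1
spark.driver.host          127.0.0.1
```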

$ cat /etc/hostname
# hosts This file describes a number of hostname-to-address
#   mappings for the TCP/IP subsystem.  It is mostly
#   used at boot time, when no name servers are running.
#   On small systems, this file can be used instead of a
#   "named" name server.
# Syntax:
#
# IP-Address  Full-Qualified-Hostname  Short-Hostname
#

192.168.0.42












zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
mvn exec:java -Dexec.mainClass=com.javacodegeek.examples.SparkExampleRDD
-Dexec.args="input.txt"
[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model
for javacodegeek:examples:jar:1.0-SNAPSHOT
[WARNING] 'build.plugins.plugin.version' for
org.apache.maven.plugins:maven-compiler-plugin is missing. @ line 12,
column 21
[WARNING]
[WARNING] It is highly recommended to fix these problems because they
threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support
building such malformed projects.
[WARNING]
[INFO]
[INFO] -------------------------< javacodegeek:examples >-------------------------
[INFO] Building examples 1.0-SNAPSHOT
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- exec-maven-plugin:1.6.0:java (default-cli) @ examples ---
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform
(file:/home/zahid/.m2/repository/org/apache/spark/spark-unsafe_2.12/2.4.5/spark-unsafe_2.12-2.4.5.jar)
to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of
org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal
reflective access operations
WARNING: All illegal access operations will be denied in a future release
Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
20/02/29 17:20:40 INFO SparkContext: Running Spark version 2.4.5
20/02/29 17:20:40 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
20/02/29 17:20:41 INFO SparkContext: Submitted application: Word Count
20/02/29 17:20:41 INFO SecurityManager: Changing view acls to: zahid
20/02/29 17:20:41 INFO SecurityManager: Changing modify acls to: zahid
20/02/29 17:20:41 INFO SecurityManager: Changing view acls groups to:
20/02/29 17:20:41 INFO SecurityManager: Changing modify acls groups to:
20/02/29 17:20:41 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users  with view permissions: Set(zahid);
groups with view permissions: Set(); users  with modify permissions:
Set(zahid); groups with modify permissions: Set()
20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
random free port. You may check whether configuring an appropriate binding
address.
20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
random free port. You may check whether configuring an appropriate binding
address.
20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
random free port. You may check whether configuring an appropriate binding
address.
20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
random free port. You may check whether configuring an appropriate binding
address.
20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
random free port. You may check whether configuring an appropriate binding
address.
20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
random free port. You may check whether configuring an appropriate binding
address.
20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
random free port. You may check whether configuring an appropriate binding
address.
20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' c