[jira] [Issue Comment Deleted] (SPARK-3093) masterLock in Worker is no longer need

2014-08-17 Thread Chen Chao (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-3093?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chen Chao updated SPARK-3093:
-

Comment: was deleted

(was: PR available at https://github.com/apache/spark/pull/2008)

> masterLock in Worker is no longer need
> --
>
> Key: SPARK-3093
> URL: https://issues.apache.org/jira/browse/SPARK-3093
> Project: Spark
>  Issue Type: Improvement
>Reporter: Chen Chao
>
> there's no need to use masterLock in Worker now since all communications are 
> within Akka actor



--
This message was sent by Atlassian JIRA
(v6.2#6252)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-3093) masterLock in Worker is no longer need

2014-08-17 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-3093?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14100230#comment-14100230
 ] 

Chen Chao commented on SPARK-3093:
--

PR available at https://github.com/apache/spark/pull/2008

> masterLock in Worker is no longer need
> --
>
> Key: SPARK-3093
> URL: https://issues.apache.org/jira/browse/SPARK-3093
> Project: Spark
>  Issue Type: Improvement
>Reporter: Chen Chao
>
> there's no need to use masterLock in Worker now since all communications are 
> within Akka actor






[jira] [Created] (SPARK-3093) masterLock in Worker is no longer need

2014-08-17 Thread Chen Chao (JIRA)
Chen Chao created SPARK-3093:


 Summary: masterLock in Worker is no longer need
 Key: SPARK-3093
 URL: https://issues.apache.org/jira/browse/SPARK-3093
 Project: Spark
  Issue Type: Improvement
Reporter: Chen Chao


There's no need to use masterLock in Worker anymore, since all communication
now happens within the Akka actor.
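The claim above rests on the actor model's concurrency guarantee: an actor processes its mailbox one message at a time, so state that is touched only from message handlers needs no lock. A toy Python sketch of that principle (illustrative only, not Spark's actual Worker code; Spark used an Akka actor here):

```python
import queue
import threading

class WorkerActor:
    """Toy single-threaded 'actor': one dedicated thread drains the mailbox,
    so mutable state touched only by that thread needs no lock."""

    def __init__(self):
        self.master = None               # mutated only by the actor thread
        self._mailbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, msg):
        self._mailbox.put(msg)

    def stop(self):
        self._mailbox.put(("stop", None))
        self._thread.join()

    def _run(self):
        while True:
            kind, payload = self._mailbox.get()
            if kind == "registered":
                self.master = payload    # safe: only this thread writes it
            elif kind == "stop":
                return

actor = WorkerActor()
actor.send(("registered", "spark://master:7077"))
actor.stop()
print(actor.master)
```

Because every mutation is serialized through the mailbox, a separate masterLock would only duplicate the guarantee the actor already provides.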






[jira] [Closed] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-07-29 Thread Chen Chao (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chen Chao closed SPARK-2000.


Resolution: Not a Problem

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>Assignee: Chen Chao
>  Labels: shell
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Commented] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-07-29 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14078738#comment-14078738
 ] 

Chen Chao commented on SPARK-2000:
--

[~pwendell] I think we can close this issue. There's no problem with the code;
we just need to update the guide. I'll send another patch to fix the docs.


> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>Assignee: Chen Chao
>  Labels: shell
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Commented] (SPARK-2400) config spark.yarn.max.executor.failures is not explained accurately

2014-07-07 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2400?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14054542#comment-14054542
 ] 

Chen Chao commented on SPARK-2400:
--

PR @ https://github.com/apache/spark/pull/1282/files

> config spark.yarn.max.executor.failures is not explained accurately
> ---
>
> Key: SPARK-2400
> URL: https://issues.apache.org/jira/browse/SPARK-2400
> Project: Spark
>  Issue Type: Bug
>  Components: YARN
>Reporter: Chen Chao
>Priority: Minor
>






[jira] [Commented] (SPARK-2400) config spark.yarn.max.executor.failures is not explained accurately

2014-07-07 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2400?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14054540#comment-14054540
 ] 

Chen Chao commented on SPARK-2400:
--

It should be "numExecutors * 2, with a minimum of 3" rather than "2 * numExecutors".
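For concreteness, the documented default can be sketched as a tiny helper (hypothetical function name; the real computation lives in Spark's YARN code):

```python
def max_executor_failures(num_executors, configured=None):
    """Default for spark.yarn.max.executor.failures as documented:
    numExecutors * 2, with a minimum of 3; an explicit setting wins."""
    if configured is not None:
        return configured
    return max(num_executors * 2, 3)

print(max_executor_failures(1))    # the floor of 3 kicks in
print(max_executor_failures(10))   # 10 * 2 = 20
```

The minimum matters for small clusters: with one executor, "2 * numExecutors" would allow only two failures before the application gives up.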


> config spark.yarn.max.executor.failures is not explained accurately
> ---
>
> Key: SPARK-2400
> URL: https://issues.apache.org/jira/browse/SPARK-2400
> Project: Spark
>  Issue Type: Bug
>  Components: YARN
>Reporter: Chen Chao
>Priority: Minor
>






[jira] [Created] (SPARK-2400) config spark.yarn.max.executor.failures is not explained accurately

2014-07-07 Thread Chen Chao (JIRA)
Chen Chao created SPARK-2400:


 Summary: config spark.yarn.max.executor.failures is not explained 
accurately
 Key: SPARK-2400
 URL: https://issues.apache.org/jira/browse/SPARK-2400
 Project: Spark
  Issue Type: Bug
  Components: YARN
Reporter: Chen Chao
Priority: Minor








[jira] [Commented] (SPARK-2031) DAGScheduler supports pluggable clock

2014-06-10 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2031?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14027311#comment-14027311
 ] 

Chen Chao commented on SPARK-2031:
--

Already resolved; please close.

> DAGScheduler supports pluggable clock
> -
>
> Key: SPARK-2031
> URL: https://issues.apache.org/jira/browse/SPARK-2031
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 0.9.1, 1.0.0
>Reporter: Chen Chao
>
> DAGScheduler supports pluggable clock like what TaskSetManager does. 





[jira] [Commented] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-06-09 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14026052#comment-14026052
 ] 

Chen Chao commented on SPARK-2000:
--

Hi Patrick, I just thought it was the same problem as
https://issues.apache.org/jira/browse/SPARK-1028 .
Anyway, if you think it is not necessary, please close the issue. :)

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>Assignee: Chen Chao
>  Labels: shell
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Commented] (SPARK-2031) DAGScheduler supports pluggable clock

2014-06-04 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2031?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14018533#comment-14018533
 ] 

Chen Chao commented on SPARK-2031:
--

PR https://github.com/apache/spark/pull/976

> DAGScheduler supports pluggable clock
> -
>
> Key: SPARK-2031
> URL: https://issues.apache.org/jira/browse/SPARK-2031
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 0.9.1, 1.0.0
>Reporter: Chen Chao
>
> DAGScheduler supports pluggable clock like what TaskSetManager does. 





[jira] [Created] (SPARK-2031) DAGScheduler supports pluggable clock

2014-06-04 Thread Chen Chao (JIRA)
Chen Chao created SPARK-2031:


 Summary: DAGScheduler supports pluggable clock
 Key: SPARK-2031
 URL: https://issues.apache.org/jira/browse/SPARK-2031
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 1.0.0, 0.9.1
Reporter: Chen Chao


DAGScheduler should support a pluggable clock, like TaskSetManager already does.
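The pattern requested above, as a minimal Python sketch (illustrative only; Spark's version is a Scala Clock trait): inject a clock interface so production code reads real time while tests substitute a manually advanced clock and become deterministic:

```python
import time

class SystemClock:
    """Production clock: real wall-clock time."""
    def now(self):
        return time.time()

class ManualClock:
    """Test clock: time moves only when the test says so."""
    def __init__(self, start=0.0):
        self._t = start
    def now(self):
        return self._t
    def advance(self, secs):
        self._t += secs

class Scheduler:
    # clock is injected; defaults to the real one in production
    def __init__(self, clock=None):
        self.clock = clock or SystemClock()
        self.started_at = self.clock.now()

    def elapsed(self):
        return self.clock.now() - self.started_at

clock = ManualClock()
sched = Scheduler(clock)
clock.advance(42.0)
print(sched.elapsed())  # 42.0
```

Without injection, any timeout or delay logic in the scheduler can only be tested with real sleeps, which makes tests slow and flaky.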





[jira] [Issue Comment Deleted] (SPARK-1999) UI : StorageLevel in storage tab and RDD Storage Info never changes

2014-06-04 Thread Chen Chao (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-1999?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chen Chao updated SPARK-1999:
-

Comment: was deleted

(was: I have fixed and tested fine. Please assign it to me , I will post a PR 
soon!)

> UI : StorageLevel in storage tab and RDD Storage Info never changes 
> 
>
> Key: SPARK-1999
> URL: https://issues.apache.org/jira/browse/SPARK-1999
> Project: Spark
>  Issue Type: Bug
>  Components: Web UI
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>
> StorageLevel in 'storage tab' and 'RDD Storage Info' never changes even if 
> you call rdd.unpersist() and then you give the rdd another different storage 
> level.





[jira] [Commented] (SPARK-1999) UI : StorageLevel in storage tab and RDD Storage Info never changes

2014-06-04 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-1999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14017519#comment-14017519
 ] 

Chen Chao commented on SPARK-1999:
--

PR: https://github.com/apache/spark/pull/968

> UI : StorageLevel in storage tab and RDD Storage Info never changes 
> 
>
> Key: SPARK-1999
> URL: https://issues.apache.org/jira/browse/SPARK-1999
> Project: Spark
>  Issue Type: Bug
>  Components: Web UI
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>
> StorageLevel in 'storage tab' and 'RDD Storage Info' never changes even if 
> you call rdd.unpersist() and then you give the rdd another different storage 
> level.





[jira] [Issue Comment Deleted] (SPARK-1999) UI : StorageLevel in storage tab and RDD Storage Info never changes

2014-06-04 Thread Chen Chao (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-1999?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chen Chao updated SPARK-1999:
-

Comment: was deleted

(was: https://github.com/apache/spark/pull/950
sorry,i will repost soon, the above link will be invalid.)

> UI : StorageLevel in storage tab and RDD Storage Info never changes 
> 
>
> Key: SPARK-1999
> URL: https://issues.apache.org/jira/browse/SPARK-1999
> Project: Spark
>  Issue Type: Bug
>  Components: Web UI
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>
> StorageLevel in 'storage tab' and 'RDD Storage Info' never changes even if 
> you call rdd.unpersist() and then you give the rdd another different storage 
> level.





[jira] [Commented] (SPARK-1999) UI : StorageLevel in storage tab and RDD Storage Info never changes

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-1999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016895#comment-14016895
 ] 

Chen Chao commented on SPARK-1999:
--

I have fixed this and tested it; it works fine. Please assign it to me and I will post a PR soon!

> UI : StorageLevel in storage tab and RDD Storage Info never changes 
> 
>
> Key: SPARK-1999
> URL: https://issues.apache.org/jira/browse/SPARK-1999
> Project: Spark
>  Issue Type: Bug
>  Components: Web UI
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>
> StorageLevel in 'storage tab' and 'RDD Storage Info' never changes even if 
> you call rdd.unpersist() and then you give the rdd another different storage 
> level.





[jira] [Comment Edited] (SPARK-1999) UI : StorageLevel in storage tab and RDD Storage Info never changes

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-1999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016385#comment-14016385
 ] 

Chen Chao edited comment on SPARK-1999 at 6/3/14 3:41 PM:
--

https://github.com/apache/spark/pull/950
Sorry, I will repost soon; the link above will become invalid.


was (Author: crazyjvm):
https://github.com/apache/spark/pull/950

> UI : StorageLevel in storage tab and RDD Storage Info never changes 
> 
>
> Key: SPARK-1999
> URL: https://issues.apache.org/jira/browse/SPARK-1999
> Project: Spark
>  Issue Type: Bug
>  Components: Web UI
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>
> StorageLevel in 'storage tab' and 'RDD Storage Info' never changes even if 
> you call rdd.unpersist() and then you give the rdd another different storage 
> level.





[jira] [Updated] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-06-03 Thread Chen Chao (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chen Chao updated SPARK-2000:
-

Priority: Major  (was: Blocker)

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>Assignee: Guoqiang Li
>  Labels: shell
> Fix For: 1.0.1, 1.1.0
>
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Comment Edited] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016634#comment-14016634
 ] 

Chen Chao edited comment on SPARK-2000 at 6/3/14 3:33 PM:
--

Actually, I have already posted a PR: https://github.com/apache/spark/pull/952
so please assign it to me.


was (Author: crazyjvm):
actually , I have already posted PR : https://github.com/apache/spark/pull/952

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>Assignee: Guoqiang Li
>Priority: Blocker
>  Labels: shell
> Fix For: 1.0.1, 1.1.0
>
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Commented] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016634#comment-14016634
 ] 

Chen Chao commented on SPARK-2000:
--

Actually, I have already posted a PR: https://github.com/apache/spark/pull/952

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>Assignee: Guoqiang Li
>Priority: Blocker
>  Labels: shell
> Fix For: 1.0.1, 1.1.0
>
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Comment Edited] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016602#comment-14016602
 ] 

Chen Chao edited comment on SPARK-2000 at 6/3/14 3:24 PM:
--

You just run spark-shell and can connect to the cluster? I mean without setting
the --master option.


was (Author: crazyjvm):
you just run spark-shell and can connect to the cluster?

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>  Labels: shell
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Comment Edited] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016602#comment-14016602
 ] 

Chen Chao edited comment on SPARK-2000 at 6/3/14 3:24 PM:
--

You just run spark-shell and can connect to the cluster? I mean without setting
the --master option.


was (Author: crazyjvm):
you just run spark-shell and can connect to the cluster? i mean without srtting 
--master option

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>  Labels: shell
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Commented] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016602#comment-14016602
 ] 

Chen Chao commented on SPARK-2000:
--

You just run spark-shell and can connect to the cluster?

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>  Labels: shell
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Comment Edited] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016592#comment-14016592
 ] 

Chen Chao edited comment on SPARK-2000 at 6/3/14 3:11 PM:
--

Just run spark-shell in standalone mode on one of the cluster nodes without the
--master option, and you will find that it does not connect to the cluster.
According to the documentation and previous releases, it should normally connect
to the cluster even though there is no --master option.

Please refer to http://spark.apache.org/docs/latest/spark-standalone.html :
"Note that if you are running spark-shell from one of the spark cluster 
machines, the bin/spark-shell script will automatically set MASTER from the 
SPARK_MASTER_IP and SPARK_MASTER_PORT variables in conf/spark-env.sh."


was (Author: crazyjvm):
just run spark-shell in standalone model on one the of cluster node, and you 
will find it will not connect to cluster.

please refer to http://spark.apache.org/docs/latest/spark-standalone.html
"Note that if you are running spark-shell from one of the spark cluster 
machines, the bin/spark-shell script will automatically set MASTER from the 
SPARK_MASTER_IP and SPARK_MASTER_PORT variables in conf/spark-env.sh."

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>  Labels: shell
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Comment Edited] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016592#comment-14016592
 ] 

Chen Chao edited comment on SPARK-2000 at 6/3/14 3:09 PM:
--

Just run spark-shell in standalone mode on one of the cluster nodes, and you
will find that it does not connect to the cluster.

Please refer to http://spark.apache.org/docs/latest/spark-standalone.html :
"Note that if you are running spark-shell from one of the spark cluster 
machines, the bin/spark-shell script will automatically set MASTER from the 
SPARK_MASTER_IP and SPARK_MASTER_PORT variables in conf/spark-env.sh."


was (Author: crazyjvm):
just run spark-shell in standalone model on one the of cluster node, and you 
will find it will not connect to cluster.

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>  Labels: shell
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Updated] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node without specify master

2014-06-03 Thread Chen Chao (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chen Chao updated SPARK-2000:
-

Summary: cannot connect to cluster in Standalone mode when run spark-shell 
in one of the cluster node without specify master  (was: cannot connect to 
cluster in Standalone mode when run spark-shell in one of the cluster node)

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node without specify master
> ---
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>  Labels: shell
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Commented] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016592#comment-14016592
 ] 

Chen Chao commented on SPARK-2000:
--

Just run spark-shell in standalone mode on one of the cluster nodes, and you
will find that it does not connect to the cluster.

> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node
> 
>
> Key: SPARK-2000
> URL: https://issues.apache.org/jira/browse/SPARK-2000
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>  Labels: shell
>
> cannot connect to cluster in Standalone mode when run spark-shell in one of 
> the cluster node.





[jira] [Created] (SPARK-2000) cannot connect to cluster in Standalone mode when run spark-shell in one of the cluster node

2014-06-03 Thread Chen Chao (JIRA)
Chen Chao created SPARK-2000:


 Summary: cannot connect to cluster in Standalone mode when run 
spark-shell in one of the cluster node
 Key: SPARK-2000
 URL: https://issues.apache.org/jira/browse/SPARK-2000
 Project: Spark
  Issue Type: Bug
  Components: Deploy
Affects Versions: 1.0.0
Reporter: Chen Chao


cannot connect to cluster in Standalone mode when run spark-shell in one of the 
cluster node.





[jira] [Commented] (SPARK-1999) UI : StorageLevel in storage tab and RDD Storage Info never changes

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-1999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016385#comment-14016385
 ] 

Chen Chao commented on SPARK-1999:
--

PR : https://issues.apache.org/jira/browse/SPARK-1999

> UI : StorageLevel in storage tab and RDD Storage Info never changes 
> 
>
> Key: SPARK-1999
> URL: https://issues.apache.org/jira/browse/SPARK-1999
> Project: Spark
>  Issue Type: Bug
>  Components: Web UI
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>
> StorageLevel in 'storage tab' and 'RDD Storage Info' never changes even if 
> you call rdd.unpersist() and then you give the rdd another different storage 
> level.





[jira] [Comment Edited] (SPARK-1999) UI : StorageLevel in storage tab and RDD Storage Info never changes

2014-06-03 Thread Chen Chao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-1999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016385#comment-14016385
 ] 

Chen Chao edited comment on SPARK-1999 at 6/3/14 11:16 AM:
---

https://github.com/apache/spark/pull/950


was (Author: crazyjvm):
PR : https://issues.apache.org/jira/browse/SPARK-1999

> UI : StorageLevel in storage tab and RDD Storage Info never changes 
> 
>
> Key: SPARK-1999
> URL: https://issues.apache.org/jira/browse/SPARK-1999
> Project: Spark
>  Issue Type: Bug
>  Components: Web UI
>Affects Versions: 1.0.0
>Reporter: Chen Chao
>
> StorageLevel in 'storage tab' and 'RDD Storage Info' never changes even if 
> you call rdd.unpersist() and then you give the rdd another different storage 
> level.





[jira] [Created] (SPARK-1999) UI : StorageLevel in storage tab and RDD Storage Info never changes

2014-06-03 Thread Chen Chao (JIRA)
Chen Chao created SPARK-1999:


 Summary: UI : StorageLevel in storage tab and RDD Storage Info 
never changes 
 Key: SPARK-1999
 URL: https://issues.apache.org/jira/browse/SPARK-1999
 Project: Spark
  Issue Type: Bug
  Components: Web UI
Affects Versions: 1.0.0
Reporter: Chen Chao


StorageLevel in the 'storage tab' and 'RDD Storage Info' never changes, even if
you call rdd.unpersist() and then give the RDD a different storage level.
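One plausible shape of this bug class (a toy model, not Spark's actual Web UI code): the UI keeps a per-RDD map of storage levels, and if unpersist() never evicts the entry, the first level ever recorded sticks forever:

```python
class StorageTab:
    """Toy model of a UI cache of RDD storage levels."""

    def __init__(self, evict_on_unpersist):
        self.evict_on_unpersist = evict_on_unpersist
        self.level = {}  # rdd_id -> storage level shown in the UI

    def persist(self, rdd_id, level):
        if self.evict_on_unpersist:
            self.level[rdd_id] = level            # fixed: reflect the latest level
        else:
            self.level.setdefault(rdd_id, level)  # buggy: first level sticks

    def unpersist(self, rdd_id):
        if self.evict_on_unpersist:
            self.level.pop(rdd_id, None)          # fixed: drop the stale entry

buggy, fixed = StorageTab(False), StorageTab(True)
for tab in (buggy, fixed):
    tab.persist(1, "MEMORY_ONLY")
    tab.unpersist(1)
    tab.persist(1, "DISK_ONLY")   # re-persist with a different level

print(buggy.level[1], fixed.level[1])  # MEMORY_ONLY DISK_ONLY
```

The symptom in the report matches the buggy variant: the displayed level stays at whatever was first persisted.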





[jira] [Created] (SPARK-1913) column pruning problem of Parquet File

2014-05-23 Thread Chen Chao (JIRA)
Chen Chao created SPARK-1913:


 Summary: column pruning problem of Parquet  File
 Key: SPARK-1913
 URL: https://issues.apache.org/jira/browse/SPARK-1913
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 1.0.0
 Environment: mac os 10.9.2
Reporter: Chen Chao


case class Person(name: String, age: Int)

If we use a Parquet file, the following statement throws an exception saying
java.lang.IllegalArgumentException: Column age does not exist:
sql("SELECT name FROM parquetFile WHERE age >= 13 AND age <= 19")
We have to add the age column after SELECT to make it work:
sql("SELECT name, age FROM parquetFile WHERE age >= 13 AND age <= 19")

The same error also occurs when we use the DSL:
 parquetFile.where('key === 1).select('value as 'a).collect().foreach(println)
complains that it cannot find column 'key'; we have to write it like this:
 parquetFile.where('key === 1).select('key, 'value as 'a).collect().foreach(println)

Obviously, that's not the way we want it to work!
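The symptom suggests the Parquet scan was asked only for the SELECT-list columns, while the predicate still referenced age. A correct pruner must request the union of projection and predicate columns; a sketch of that rule in Python (illustrative only, not Spark's planner code):

```python
def columns_to_read(select_cols, where_cols, schema):
    """Columns the scan must fetch: everything referenced in the projection
    OR the predicate. Requesting only select_cols reproduces the bug."""
    needed = set(select_cols) | set(where_cols)
    missing = needed - set(schema)
    if missing:
        raise ValueError("Column %s does not exist" % ", ".join(sorted(missing)))
    return sorted(needed)

schema = ["name", "age"]
# SELECT name FROM parquetFile WHERE age >= 13 AND age <= 19
print(columns_to_read(["name"], ["age"], schema))   # ['age', 'name']
```

Adding age to the SELECT list works around the bug precisely because it forces age into the requested column set by hand.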





[jira] [Updated] (SPARK-1627) Spark SQL should better support external sorting

2014-04-25 Thread Chen Chao (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-1627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chen Chao updated SPARK-1627:
-

Description: Spark SQL Aggregator does not support external sorting now 
which is extremely important when data is much larger than memory.   (was: 
Spark SQL Aggregator do not support external sorting now which is extremely 
important when data is much larger than memory. )

> Spark SQL should better support external sorting
> 
>
> Key: SPARK-1627
> URL: https://issues.apache.org/jira/browse/SPARK-1627
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 1.0.0, 0.9.1
>Reporter: Chen Chao
>
> Spark SQL Aggregator does not support external sorting now which is extremely 
> important when data is much larger than memory. 





[jira] [Created] (SPARK-1627) Spark SQL should better support external sorting

2014-04-25 Thread Chen Chao (JIRA)
Chen Chao created SPARK-1627:


 Summary: Spark SQL should better support external sorting
 Key: SPARK-1627
 URL: https://issues.apache.org/jira/browse/SPARK-1627
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Affects Versions: 1.0.0, 0.9.1
Reporter: Chen Chao


Spark SQL Aggregator does not support external sorting yet, which is extremely
important when data is much larger than memory.
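External sorting means sorting fixed-size runs that fit in memory, spilling them, and then merging the runs as streams, so memory holds only one chunk plus one element per run. A minimal Python sketch of the technique (in-memory 'runs' stand in for the on-disk spill files a real implementation would use):

```python
import heapq
import itertools

def external_sort(items, chunk_size):
    """Sort a stream larger than memory: sort chunk_size-sized runs,
    then do a streaming k-way merge over the sorted runs."""
    it = iter(items)
    runs = []
    while True:
        chunk = list(itertools.islice(it, chunk_size))
        if not chunk:
            break
        runs.append(sorted(chunk))       # each run fits in memory
    # heapq.merge keeps only one element per run in memory at a time
    return list(heapq.merge(*runs))

print(external_sort([5, 3, 8, 1, 9, 2, 7], chunk_size=3))  # [1, 2, 3, 5, 7, 8, 9]
```

An aggregator without this falls back to building the whole hash map or sorted buffer in memory, which is what fails once the data no longer fits.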


