[jira] [Updated] (SPARK-2670) FetchFailedException should be thrown when local fetch has failed

2014-08-01 Thread: Matei Zaharia (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-2670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Matei Zaharia updated SPARK-2670:
---------------------------------

Priority: Major  (was: Critical)

> FetchFailedException should be thrown when local fetch has failed
> -----------------------------------------------------------------
>
> Key: SPARK-2670
> URL: https://issues.apache.org/jira/browse/SPARK-2670
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.0.0
> Reporter: Kousuke Saruta
> Assignee: Kousuke Saruta
> Fix For: 1.1.0
>
>
> In BasicBlockFetchIterator, when a remote fetch fails, a FetchResult
> whose size is -1 is put into results:
> {code}
> case None => {
>   logError("Could not get block(s) from " + cmId)
>   for ((blockId, size) <- req.blocks) {
>     results.put(new FetchResult(blockId, -1, null))
>   }
> }
> {code}
> A size of -1 marks the fetch as failed, and BlockStoreShuffleFetcher#unpackBlock
> throws FetchFailedException so that the fetch can be retried.
> However, when a local fetch fails, no failed FetchResult is put into results,
> so the failed fetch cannot be retried.
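The fix the report implies is to make the local-fetch path mirror the remote one: catch the failure and enqueue a size -1 FetchResult so the consumer can throw FetchFailedException. A minimal self-contained sketch of that pattern is below; `FetchResult`, `results`, and `LocalFetchSketch` are simplified stand-ins modeled on the snippet above, not Spark's real classes.

```scala
import java.util.concurrent.LinkedBlockingQueue
import scala.util.{Try, Success, Failure}

// Simplified stand-in for Spark's FetchResult: size == -1 marks a failed fetch.
case class FetchResult(blockId: String, size: Long, data: () => Iterator[Any])

// Hypothetical fetcher: getLocalBlock reads a block locally and may throw.
class LocalFetchSketch(getLocalBlock: String => Iterator[Any]) {
  val results = new LinkedBlockingQueue[FetchResult]()

  def fetchLocal(blockIds: Seq[String]): Unit = {
    for (blockId <- blockIds) {
      Try(getLocalBlock(blockId)) match {
        case Success(iter) =>
          // Placeholder positive size; the real code would record the block's size.
          results.put(FetchResult(blockId, 1L, () => iter))
        case Failure(_) =>
          // Mirror the remote-fetch failure path: enqueue size -1 instead of
          // silently dropping the block, so the consumer can detect the failure.
          results.put(FetchResult(blockId, -1L, null))
      }
    }
  }
}
```

A consumer in the role of unpackBlock would then check `size == -1` and throw FetchFailedException, triggering the retry machinery for local failures just as it already does for remote ones.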



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Updated] (SPARK-2670) FetchFailedException should be thrown when local fetch has failed

2014-08-01 Thread: Matei Zaharia (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-2670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Matei Zaharia updated SPARK-2670:
---------------------------------

Assignee: Kousuke Saruta


[jira] [Updated] (SPARK-2670) FetchFailedException should be thrown when local fetch has failed

2014-07-24 Thread: Patrick Wendell (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-2670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell updated SPARK-2670:
-----------------------------------

Priority: Critical  (was: Major)



[jira] [Updated] (SPARK-2670) FetchFailedException should be thrown when local fetch has failed

2014-07-24 Thread: Patrick Wendell (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-2670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell updated SPARK-2670:
-----------------------------------

Component/s: Spark Core



[jira] [Updated] (SPARK-2670) FetchFailedException should be thrown when local fetch has failed

2014-07-24 Thread: Patrick Wendell (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-2670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell updated SPARK-2670:
-----------------------------------

Target Version/s: 1.1.0
