[jira] [Assigned] (SPARK-21181) Suppress memory leak errors reported by netty

2017-06-22 Thread Apache Spark (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-21181?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-21181:


Assignee: (was: Apache Spark)

> Suppress memory leak errors reported by netty
> ---------------------------------------------
>
> Key: SPARK-21181
> URL: https://issues.apache.org/jira/browse/SPARK-21181
> Project: Spark
>  Issue Type: Bug
>  Components: Input/Output
>Affects Versions: 2.1.0
>Reporter: Dhruve Ashar
>Priority: Minor
>
> We are seeing netty report memory leak errors like the one below after switching to 2.1.
> {code}
> ERROR ResourceLeakDetector: LEAK: ByteBuf.release() was not called before 
> it's garbage-collected. Enable advanced leak reporting to find out where the 
> leak occurred. To enable advanced leak reporting, specify the JVM option 
> '-Dio.netty.leakDetection.level=advanced' or call 
> ResourceLeakDetector.setLevel() See 
> http://netty.io/wiki/reference-counted-objects.html for more information.
> {code}
> Looking a bit deeper, we found that Spark is not leaking any memory here, but 
> the error message in the driver logs is confusing for the user. 
> After enabling '-Dio.netty.leakDetection.level=advanced', netty reveals the 
> SparkSaslServer to be the source of these leaks.
> Sample trace: https://gist.github.com/dhruve/b299ebc35aa0a185c244a0468927daf1
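
For context, the advanced reporting that the log message points to can be turned on either with the quoted JVM flag or programmatically through the ResourceLeakDetector.setLevel() call the message itself names. Below is a minimal sketch of the programmatic route; the class name LeakReportingConfig is made up for illustration.

{code}
import io.netty.util.ResourceLeakDetector;

// Minimal sketch: raise Netty's leak detection level at startup.
// Equivalent to launching the JVM with -Dio.netty.leakDetection.level=advanced.
public class LeakReportingConfig {
    public static void main(String[] args) {
        // ADVANCED records where leaked ByteBufs were last accessed, which is the
        // level that traces reports like the one above back to their allocation site.
        ResourceLeakDetector.setLevel(ResourceLeakDetector.Level.ADVANCED);
        System.out.println("Netty leak detection level: " + ResourceLeakDetector.getLevel());
    }
}
{code}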



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-21181) Suppress memory leak errors reported by netty

2017-06-22 Thread Apache Spark (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-21181?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-21181:


Assignee: Apache Spark

> Suppress memory leak errors reported by netty
> ---------------------------------------------
>
> Key: SPARK-21181
> URL: https://issues.apache.org/jira/browse/SPARK-21181
> Project: Spark
>  Issue Type: Bug
>  Components: Input/Output
>Affects Versions: 2.1.0
>Reporter: Dhruve Ashar
>Assignee: Apache Spark
>Priority: Minor
>
> We are seeing netty report memory leak errors like the one below after switching to 2.1.
> {code}
> ERROR ResourceLeakDetector: LEAK: ByteBuf.release() was not called before 
> it's garbage-collected. Enable advanced leak reporting to find out where the 
> leak occurred. To enable advanced leak reporting, specify the JVM option 
> '-Dio.netty.leakDetection.level=advanced' or call 
> ResourceLeakDetector.setLevel() See 
> http://netty.io/wiki/reference-counted-objects.html for more information.
> {code}
> Looking a bit deeper, we found that Spark is not leaking any memory here, but 
> the error message in the driver logs is confusing for the user. 
> After enabling '-Dio.netty.leakDetection.level=advanced', netty reveals the 
> SparkSaslServer to be the source of these leaks.
> Sample trace: https://gist.github.com/dhruve/b299ebc35aa0a185c244a0468927daf1
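
As for silencing the reports themselves (the subject of this ticket), one blunt option would be to drop the detection level to DISABLED. The sketch below only illustrates that workaround under the assumption that hiding the reports is acceptable; it is not the change made for SPARK-21181, it also hides genuine leaks, and the class name SuppressLeakReports is made up.

{code}
import io.netty.util.ResourceLeakDetector;

// Sketch of a blunt workaround only: turn Netty leak detection off entirely,
// the same effect as -Dio.netty.leakDetection.level=disabled. This silences
// reports about real leaks as well, so it is no substitute for releasing the
// offending ByteBuf correctly.
public class SuppressLeakReports {
    public static void main(String[] args) {
        ResourceLeakDetector.setLevel(ResourceLeakDetector.Level.DISABLED);
    }
}
{code}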


