Re: Set up a Coverity scan for Spark

2016-03-04 Thread Sean Owen
No, those are all in the Java examples, and while we should show stopping
the context, it has no big impact. Still, it's worth touching up.

I'm more concerned about the ones with a potential correctness implication.
They are easy to fix and already identified, so why wouldn't we fix them?
We take PRs to fix typos in comments.
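
For the example cleanups, the fix is just stopping the context in a finally
block. A minimal sketch of the pattern — `StoppableContext` is a stand-in so
the snippet is self-contained; in the real examples it would be the
`JavaSparkContext`:

```java
// Stand-in for JavaSparkContext, so this sketch compiles on its own.
final class StoppableContext {
    private boolean stopped = false;
    long count() { return 3L; }          // pretend job
    void stop() { stopped = true; }
    boolean isStopped() { return stopped; }
}

final class ExampleApp {
    // The shape the touched-up examples would take: stop the context
    // even when the job body throws.
    static StoppableContext run() {
        StoppableContext sc = new StoppableContext();
        try {
            System.out.println("count: " + sc.count());
        } finally {
            sc.stop();
        }
        return sc;
    }
}
```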

On Fri, Mar 4, 2016 at 3:36 PM, Ted Yu  wrote:
> Is there JIRA for fixing the resource leaks w.r.t. unclosed SparkContext ?
>
> I wonder if such defects are really high priority.
>
> Cheers

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: Set up a Coverity scan for Spark

2016-03-04 Thread Ted Yu
Is there a JIRA for fixing the resource leaks from unclosed SparkContexts?

I wonder if such defects are really high priority.

Cheers

On Fri, Mar 4, 2016 at 7:06 AM, Sean Owen  wrote:

> Hi Ted, I've already marked them. You should be able to see the ones
> marked "Fix Required" if you click through to the defects. Most are
> just bad form and probably have no impact. The few that looked
> reasonably important were:
>
> - using platform char encoding, not UTF-8
> - Incorrect notify/wait
> - volatile count with non-atomic update
> - bad equals/hashCode


Re: Set up a Coverity scan for Spark

2016-03-04 Thread Sean Owen
Hi Ted, I've already marked them. You should be able to see the ones
marked "Fix Required" if you click through to the defects. Most are
just bad form and probably have no impact. The few that looked
reasonably important were:

- Using the platform default charset instead of UTF-8
- Incorrect notify/wait usage
- A volatile count with a non-atomic update
- A broken equals/hashCode pair
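
To make the third item concrete: `++` on a volatile field is a separate read,
add, and write, so concurrent increments can be lost even though the field is
visible across threads. The standard fix is an atomic; `Counters` here is an
illustrative class, not Spark code:

```java
import java.util.concurrent.atomic.AtomicLong;

final class Counters {
    // Defective shape: volatile gives visibility, but count++ is not atomic,
    // so two threads can read the same value and both write value + 1.
    static volatile long racyCount = 0;
    static void racyIncrement() { racyCount++; }

    // Fixed shape: the read-modify-write becomes a single atomic operation.
    static final AtomicLong safeCount = new AtomicLong();
    static void safeIncrement() { safeCount.incrementAndGet(); }
}
```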

On Fri, Mar 4, 2016 at 2:52 PM, Ted Yu  wrote:
> Last time I checked there wasn't high impact defects.
>
> Mind pointing out the defects you think should be fixed ?
>
> Thanks




Re: Set up a Coverity scan for Spark

2016-03-04 Thread Ted Yu
Last time I checked, there weren't any high-impact defects.

Mind pointing out the defects you think should be fixed?

Thanks
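
For reference, the notify/wait item in Sean's list is typically one of a few
shapes Coverity flags: wait() outside a condition loop, or notify()/wait()
without holding the monitor. The safe pattern, sketched with an illustrative
`Gate` class (not taken from the Spark codebase):

```java
// A guarded wait: the condition is checked in a while loop under the monitor.
final class Gate {
    private boolean open = false;

    synchronized void open() {
        open = true;
        notifyAll();           // wake waiters; each re-checks the condition
    }

    synchronized void await() throws InterruptedException {
        while (!open) {        // while, not if: spurious wakeups happen
            wait();
        }
    }
}
```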

On Fri, Mar 4, 2016 at 4:35 AM, Sean Owen  wrote:

> Yeah, it's not going to help with Scala, but it can at least find
> stuff in the Java code. I'm not suggesting anyone run it regularly,
> but one run to catch some bugs is useful.
>
> I've already triaged ~70 issues there just in the Java code, of which
> a handful are important.


Re: Set up a Coverity scan for Spark

2016-03-04 Thread Sean Owen
Yeah, it's not going to help with Scala, but it can at least find
stuff in the Java code. I'm not suggesting anyone run it regularly,
but one run to catch some bugs is useful.

I've already triaged ~70 issues there just in the Java code, of which
a handful are important.
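
The equals/hashCode findings are worth a note, since the failure is silent:
overriding equals() without hashCode() means equal objects can hash to
different buckets, so HashMap/HashSet lookups miss them. A sketch of the
corrected shape — `BlockId` is an illustrative value class, not Spark's:

```java
import java.util.Objects;

// Illustrative value class showing equals and hashCode kept consistent.
final class BlockId {
    final String name;
    BlockId(String name) { this.name = name; }

    @Override public boolean equals(Object o) {
        return o instanceof BlockId && ((BlockId) o).name.equals(name);
    }

    // The fix the scan asks for: hashCode consistent with equals.
    @Override public int hashCode() { return Objects.hashCode(name); }
}
```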

On Fri, Mar 4, 2016 at 12:18 PM, Ted Yu  wrote:
> Since majority of code is written in Scala which is not analyzed by Coverity, 
> the efficacy of the tool seems limited.




Re: Set up a Coverity scan for Spark

2016-03-04 Thread Ted Yu
Since the majority of the code is written in Scala, which is not analyzed by
Coverity, the efficacy of the tool seems limited.
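
The platform-charset item mentioned elsewhere in the thread is a good example
of the Java-only class of bug the scan does catch: `String.getBytes()` and
`new String(byte[])` use the JVM's default charset, so results vary by
platform and locale. A sketch of the fix, naming the charset explicitly:

```java
import java.nio.charset.StandardCharsets;

final class Encodings {
    // Defective shape: uses the platform default charset.
    static byte[] defaultBytes(String s) { return s.getBytes(); }

    // Fixed shape: explicit UTF-8 gives the same result on every JVM.
    static byte[] utf8Bytes(String s) { return s.getBytes(StandardCharsets.UTF_8); }
    static String fromUtf8(byte[] b) { return new String(b, StandardCharsets.UTF_8); }
}
```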

> On Mar 4, 2016, at 2:34 AM, Sean Owen  wrote:
> 
> https://scan.coverity.com/projects/apache-spark-2f9d080d-401d-47bc-9dd1-7956c411fbb4?tab=overview
> 
> This has to be run manually, and is Java-only, but the inspection
> results are pretty good. Anyone should be able to browse them, and let
> me know if anyone would like more access.
> Most are false-positives, but it's found some reasonable little bugs.
> 
> When my stack of things to do clears I'll try to address them, but I
> bring it up as an FYI for anyone interested in static analysis.
> 
