[
https://issues.apache.org/jira/browse/LANG-1598?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17484903#comment-17484903
]
Jens Dietrich commented on LANG-1598:
-
I might be able to help here. I have developed a tool that infers and injects
nullability annotations. Only @Nullable annotations are inferred and added to
the code (by manipulating the ASTs); the default assumption is that both
arguments and return values are @NonNull. This is consistent with the approach
used by the [Infer Eradicate checker|https://fbinfer.com/docs/next/checker-eradicate/].
The tool performs the following steps:
# Tests are executed with an agent that logs null usage in arguments and
return values; the respective logs are recorded in JSON files.
# A lightweight bytecode analysis is then used to detect negative tests, i.e.
tests designed to trigger abnormal behaviour, recognised by means of an
exception oracle. Example: a test containing
assertThrows(IllegalArgumentException.class, () -> foo(null)) would not be used
to infer a @Nullable annotation for the foo argument, since it tests a
precondition that the argument must not be null.
# Finally, @Nullable annotations are propagated to overriding methods (for
arguments) or overridden methods (for returns) to comply with Liskov's
substitution principle (LSP).
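To make the third step concrete, here is a minimal sketch of the propagation rule. The Base/Sub classes are hypothetical (not Commons Lang code), and a local @Nullable stand-in is declared so the example compiles without the jsr305 jar on the classpath:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Target;

// Local stand-in for javax.annotation.Nullable, so the sketch compiles
// without the jsr305 jar on the classpath.
@Target({ElementType.METHOD, ElementType.PARAMETER})
@interface Nullable {}

class Base {
    // Return: @Nullable was inferred on Sub.find, so it is propagated up to
    // this overridden method -- a caller holding a Base reference can observe
    // the null that Sub returns.
    @Nullable
    Object find(String key) {
        return "known".equals(key) ? new Object() : null;
    }

    // Argument: @Nullable was inferred here (tests passed null, and no
    // exception oracle marked them as negative), so it is propagated down
    // to the overriding Sub.accept below.
    void accept(@Nullable Object value) {
        if (value != null) {
            value.hashCode(); // use the value only when it is present
        }
    }
}

class Sub extends Base {
    @Override
    @Nullable // inferred directly: tests observed this method returning null
    Object find(String key) {
        return null;
    }

    @Override // argument annotation propagated from Base.accept to satisfy LSP
    void accept(@Nullable Object value) {
        super.accept(value);
    }
}
```

Without the propagation, Base.find would promise a non-null result while a Sub instance reached through a Base reference could still return null.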
There are a few parameters that can be adjusted:
Firstly, by default, JSR305 annotations are used, but this can be changed to
some semantically equivalent @Nullable annotation – there is an abstraction in
the tool that defines the annotation and the dependency to be added to the
pom.
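As a rough illustration of what such an abstraction could look like (this is a hypothetical sketch, not the tool's actual API; the record name, fields, and the SpotBugs alternative are all illustrative):

```java
// Hypothetical "profile" pairing the annotation to insert with the Maven
// dependency to add to the pom. Not the tool's actual API.
record NullableProfile(String annotationFqn,
                       String groupId, String artifactId, String version) {

    // Default profile: JSR305, matching the dependency proposed in this issue.
    static final NullableProfile JSR305 = new NullableProfile(
            "javax.annotation.Nullable",
            "com.google.code.findbugs", "jsr305", "3.0.2");

    // One semantically equivalent alternative.
    static final NullableProfile SPOTBUGS = new NullableProfile(
            "edu.umd.cs.findbugs.annotations.Nullable",
            "com.github.spotbugs", "spotbugs-annotations", "4.8.3");

    // Maven snippet to inject into the pom for the chosen profile.
    String pomDependency() {
        return "<dependency>\n"
             + "  <groupId>" + groupId + "</groupId>\n"
             + "  <artifactId>" + artifactId + "</artifactId>\n"
             + "  <version>" + version + "</version>\n"
             + "</dependency>";
    }
}
```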
Secondly, which test cases should be used? The obvious choice is the project's
own test suite, but it is also possible to use a wider set from downstream
clients, such as other Commons projects.
Also, the tool currently does not annotate fields, only method signatures. I
plan to add this feature in the future.
Finally, I wonder what kind of provenance the project would expect or require.
For instance, should there be a comment next to each inferred annotation
stating how it has been inferred? Or is a standalone report sufficient?
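To make the first option concrete, here is a hypothetical example of what an inline provenance comment could look like (the signature is adapted from StringUtils.isEmpty; the comment wording is purely illustrative, and a local @Nullable stand-in is declared so the snippet compiles without jsr305):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Target;

// Local stand-in for javax.annotation.Nullable, so the sketch compiles
// without the jsr305 jar.
@Target(ElementType.PARAMETER)
@interface Nullable {}

class StringChecks {
    // Signature adapted from org.apache.commons.lang3.StringUtils.isEmpty;
    // the provenance comment below shows one possible format.
    static boolean isEmpty(
            @Nullable /* inferred: null argument observed in tests, no exception oracle */
            CharSequence cs) {
        return cs == null || cs.length() == 0;
    }
}
```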
The tool itself is not yet open-source, for the simple reason that this may
complicate the publication process (I am an academic, but the project is
actually sponsored by a gift from Oracle Labs Australia). I could open-source
it now if this turns out to be a showstopper, though.
I am keen to get some feedback. I thought it might be better to have a
discussion before creating a PR out of the blue.
> Use JSR-305 (javax.annotation) for Null-Analysis to avoid unexpected
> NullPointerExceptions
> --
>
> Key: LANG-1598
> URL: https://issues.apache.org/jira/browse/LANG-1598
> Project: Commons Lang
> Issue Type: Improvement
> Components: lang.*
> Affects Versions: 3.11
> Reporter: Alexander Guril
> Priority: Major
> Labels: newbie
> Time Spent: 20m
> Remaining Estimate: 0h
>
> Use the javax.annotation-API for Null-Analysis to avoid NPEs.
> {code:XML}
> <dependency>
>     <groupId>com.google.code.findbugs</groupId>
>     <artifactId>jsr305</artifactId>
>     <version>3.0.2</version>
> </dependency>
> {code}
--
This message was sent by Atlassian Jira
(v8.20.1#820001)