GitHub user vanzin opened a pull request:

    https://github.com/apache/spark/pull/18166

    [SPARK-20922][core] Add whitelist of classes that can be deserialized by the launcher.

    Blindly deserializing classes using Java serialization opens the code up to
    issues in other libraries, since just deserializing data from a stream may
    end up executing code (think readObject()).
    
    Since the launcher protocol is pretty self-contained, there's just a handful
    of classes it legitimately needs to deserialize, and they're in just two
    packages, so add a filter that throws errors if classes from any other
    package show up in the stream.
    
    This also maintains backwards compatibility (the updated launcher code can
    still communicate with the backend code in older Spark releases).
    
    Tested with new and existing unit tests.
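
    For reference, the filtering approach described above can be implemented by
    subclassing ObjectInputStream and overriding resolveClass(). The sketch below
    is a minimal illustration only; the class name and the accepted package
    prefixes are assumptions, not necessarily what this patch uses:

        import java.io.IOException;
        import java.io.InputStream;
        import java.io.InvalidClassException;
        import java.io.ObjectInputStream;
        import java.io.ObjectStreamClass;

        /**
         * Minimal sketch of a package-whitelisting ObjectInputStream.
         * The accepted package prefixes below are illustrative assumptions.
         */
        class WhitelistingObjectInputStream extends ObjectInputStream {

          private static final String[] ALLOWED_PREFIXES = {
            "java.lang.",
            "org.apache.spark.launcher."
          };

          WhitelistingObjectInputStream(InputStream in) throws IOException {
            super(in);
          }

          @Override
          protected Class<?> resolveClass(ObjectStreamClass desc)
              throws IOException, ClassNotFoundException {
            String name = desc.getName();
            // Array types ("[L...") would need extra handling if the protocol used them.
            for (String prefix : ALLOWED_PREFIXES) {
              if (name.startsWith(prefix)) {
                return super.resolveClass(desc);
              }
            }
            // Reject before resolution, so readObject() of an unexpected class never runs.
            throw new InvalidClassException(name, "Unexpected class in launcher protocol stream.");
          }
        }

    Because the check happens in resolveClass(), an unexpected class is rejected
    before it is ever loaded or deserialized, which is what closes the door on
    readObject()-style gadget chains.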

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/vanzin/spark SPARK-20922

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/18166.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #18166
    
----
commit b3aba60f32022bf57728800fd9e9a21ac7198813
Author: Marcelo Vanzin <van...@cloudera.com>
Date:   2017-05-31T17:47:13Z

    [SPARK-20922][core] Add whitelist of classes that can be deserialized by launcher.
    
    Blindly deserializing classes using Java serialization opens the code up to
    issues in other libraries, since just deserializing data from a stream may
    end up executing code (think readObject()).
    
    Since the launcher protocol is pretty self-contained, there's just a handful
    of classes it legitimately needs to deserialize, and they're in just two
    packages, so add a filter that throws errors if classes from any other
    package show up in the stream.
    
    This also maintains backwards compatibility (the updated launcher code can
    still communicate with the backend code in older Spark releases).

----


