Github user caneGuy commented on the issue:

    https://github.com/apache/spark/pull/19335
  
Please ask this on the spark-user list instead:
    http://apache-spark-user-list.1001560.n3.nabble.com/
    
    2017-09-25 11:29 GMT+08:00 listenLearning <notificati...@github.com>:
    
    > Hello, I recently ran into a problem during development: when I use the
    > mapPartitions API to store data into HBase, I first get a "partition not
    > found" error and then a "failed to get broadcast variable" error. Why does
    > this happen? The code and the error are below:
    > def ASpan(span: DataFrame, time: String): Unit = {
    >   try {
    >     span.mapPartitions(iter => {
    >       iter.map(line => {
    >         // One HBase Put per row, keyed by a hashed rowkey built from column 0.
    >         val put = new Put(Bytes.toBytes(CreateRowkey.Bit16(line.getString(0)) + "_101301"))
    >         put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("CALLDT_TIME1PER_30"), Bytes.toBytes(line.getString(1)))
    >         put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("CALLDT_TIME2PER_30"), Bytes.toBytes(line.getString(2)))
    >         put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("CALLDT_TIME3PER_30"), Bytes.toBytes(line.getString(3)))
    >         put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("CALLDT_TIME4PER_30"), Bytes.toBytes(line.getString(4)))
    >         put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("CALLDT_HASCALL_1"), Bytes.toBytes(line.getLong(5).toString))
    >         put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("CALLDT_HASCALL_3"), Bytes.toBytes(line.getLong(6).toString))
    >         put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("CALLDT_HASCALL_6"), Bytes.toBytes(line.getLong(7).toString))
    >         put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("CALLDT_NOCALL_1"), Bytes.toBytes(line.getLong(8).toString))
    >         put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("CALLDT_NOCALL_3"), Bytes.toBytes(line.getLong(9).toString))
    >         put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("CALLDT_NOCALL_6"), Bytes.toBytes(line.getLong(10).toString))
    >         put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("DB_TIME"), Bytes.toBytes(time))
    >         (new ImmutableBytesWritable, put)
    >       })
    >     }).saveAsNewAPIHadoopDataset(shuliStreaming.indexTable)
    >   } catch {
    >     case e: Exception =>
    >       // The Chinese log tag means "silent period & recent-months call flag: store error".
    >       shuliStreaming.WriteIn.writeLog("shuli", time, "静默期&近几月是否通话储错误", e)
    >       e.printStackTrace()
    >       println("静默期&近几月是否通话储错误" + e)
    >   }
    > }
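
    The snippet above does not show what shuliStreaming.indexTable is; for
    saveAsNewAPIHadoopDataset it would have to be a Hadoop Configuration prepared
    for HBase's TableOutputFormat. A minimal sketch of such a config, assuming
    that is the case (the table name "index_table" is a placeholder, not from
    the original code):

    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.client.Put
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable
    import org.apache.hadoop.hbase.mapreduce.TableOutputFormat
    import org.apache.hadoop.mapreduce.Job

    val hbaseConf = HBaseConfiguration.create()
    // "index_table" is a placeholder; substitute the real output table name.
    hbaseConf.set(TableOutputFormat.OUTPUT_TABLE, "index_table")
    val job = Job.getInstance(hbaseConf)
    job.setOutputFormatClass(classOf[TableOutputFormat[ImmutableBytesWritable]])
    job.setOutputKeyClass(classOf[ImmutableBytesWritable])
    job.setOutputValueClass(classOf[Put])
    // then: pairRdd.saveAsNewAPIHadoopDataset(job.getConfiguration)
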
    > error:
    > 17/09/24 23:04:17 INFO spark.CacheManager: Partition rdd_11_1 not found, computing it
    > 17/09/24 23:04:17 INFO rdd.HadoopRDD: Input split: hdfs://nameservice1/data/input/common/phlibrary/OFFLINEPHONELIBRARY.dat:1146925+1146926
    > 17/09/24 23:04:17 INFO broadcast.TorrentBroadcast: Started reading broadcast variable 1
    > 17/09/24 23:04:17 ERROR executor.Executor: Exception in task 1.0 in stage 250804.0 (TID 3190467)
    > java.io.IOException: org.apache.spark.SparkException: Failed to get broadcast_1_piece0 of broadcast_1
    >   at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1223)
    >   at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:165)
    >   at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:64)
    >   at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:64)
    >   at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:88)
    >   at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
    >   at org.apache.spark.rdd.HadoopRDD.getJobConf(HadoopRDD.scala:144)
    >   at org.apache.spark.rdd.HadoopRDD$$anon$1.<init>(HadoopRDD.scala:212)
    >   at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:208)
    >   at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:101)
    >   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    >   at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    >   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    >   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    >   at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    >   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
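
    For what it's worth, "Failed to get broadcast_1_piece0 of broadcast_1"
    usually means an executor is reading a broadcast the driver has already
    removed, for example a broadcast created by a SparkContext that has since
    been stopped, or a stale reference kept across batches or restarts in a
    long-running streaming job. One pattern people use for that case (a sketch
    under my own assumptions, not code from this PR; the PhoneLibraryBroadcast
    holder and the Map[String, String] value type are hypothetical) is a lazily
    initialized singleton, so a recovered context rebroadcasts instead of
    reusing a stale handle:

    import org.apache.spark.SparkContext
    import org.apache.spark.broadcast.Broadcast

    object PhoneLibraryBroadcast {
      @volatile private var instance: Broadcast[Map[String, String]] = _

      // Create the broadcast on first use from the live SparkContext;
      // reuse it afterwards instead of capturing a handle in a closure
      // that may outlive the context that created it.
      def getInstance(sc: SparkContext, load: => Map[String, String]): Broadcast[Map[String, String]] = {
        if (instance == null) {
          synchronized {
            if (instance == null) {
              instance = sc.broadcast(load)
            }
          }
        }
        instance
      }
    }
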
    > ------------------------------
    > You can view, comment on, or merge this pull request online at:
    >
    >   https://github.com/apache/spark/pull/19335
    > Commit Summary
    >
    >    - [SPARK-13969][ML] Add FeatureHasher transformer
    >    - [SPARK-21656][CORE] spark dynamic allocation should not idle timeout
    >    executors when tasks still to run
    >    - [SPARK-21603][SQL] The wholestage codegen will be much slower then
    >    that is closed when the function is too long
    >    - [SPARK-21738] Thriftserver doesn't cancel jobs when session is closed
    >    - [SPARK-21680][ML][MLLIB] optimize Vector compress
    >    - [SPARK-3151][BLOCK MANAGER] DiskStore.getBytes fails for files
    >    larger than 2GB
    >    - [SPARK-21743][SQL] top-most limit should not cause memory leak
    >    - [SPARK-21642][CORE] Use FQDN for DRIVER_HOST_ADDRESS instead of ip
    >    address
    >    - [SPARK-21428] Turn IsolatedClientLoader off while using builtin Hive
    >    jars for reusing CliSessionState
    >    - [SQL][MINOR][TEST] Set spark.unsafe.exceptionOnMemoryLeak to true
    >    - [SPARK-18394][SQL] Make an AttributeSet.toSeq output order consistent
    >    - [SPARK-16742] Mesos Kerberos Support
    >    - [SPARK-21677][SQL] json_tuple throws NullPointException when column
    >    is null as string type
    >    - [SPARK-21767][TEST][SQL] Add Decimal Test For Avro in VersionSuite
    >    - [SPARK-21739][SQL] Cast expression should initialize timezoneId when
    >    it is called statically to convert something into TimestampType
    >    - [SPARK-21778][SQL] Simpler Dataset.sample API in Scala / Java
    >    - [SPARK-21213][SQL] Support collecting partition-level statistics:
    >    rowCount and sizeInBytes
    >    - [SPARK-21743][SQL][FOLLOW-UP] top-most limit should not cause memory
    >    leak
    >    - [MINOR][TYPO] Fix typos: runnning and Excecutors
    >    - [SPARK-21566][SQL][PYTHON] Python method for summary
    >    - [SPARK-21790][TESTS] Fix Docker-based Integration Test errors.
    >    - [MINOR] Correct validateAndTransformSchema in GaussianMixture and
    >    AFTSurvivalRegression
    >    - [SPARK-21773][BUILD][DOCS] Installs mkdocs if missing in the path in
    >    SQL documentation build
    >    - [SPARK-21721][SQL][FOLLOWUP] Clear FileSystem deleteOnExit cache
    >    when paths are successfully removed
    >    - [SPARK-21782][CORE] Repartition creates skews when numPartitions is
    >    a power of 2
    >    - [SPARK-21718][SQL] Heavy log of type: "Skipping partition based on
    >    stats ..."
    >    - [SPARK-21468][PYSPARK][ML] Python API for FeatureHasher
    >    - [SPARK-21790][TESTS][FOLLOW-UP] Add filter pushdown verification
    >    back.
    >    - [SPARK-21617][SQL] Store correct table metadata when altering schema
    >    in Hive metastore.
    >    - [SPARK-19762][ML][FOLLOWUP] Add necessary comments to
    >    L2Regularization.
    >    - [SPARK-21070][PYSPARK] Attempt to update cloudpickle again
    >    - [SPARK-21584][SQL][SPARKR] Update R method for summary to call new
    >    implementation
    >    - [SPARK-21803][TEST] Remove the HiveDDLCommandSuite
    >    - [SPARK-20641][CORE] Add missing kvstore module in Laucher and
    >    SparkSubmit code
    >    - [SPARK-21499][SQL] Support creating persistent function for Spark
    >    UDAF(UserDefinedAggregateFunction)
    >    - [SPARK-21769][SQL] Add a table-specific option for always respecting
    >    schemas inferred/controlled by Spark SQL
    >    - [SPARK-21681][ML] fix bug of MLOR do not work correctly when
    >    featureStd contains zero
    >    - [SPARK-10931][ML][PYSPARK] PySpark Models Copy Param Values from
    >    Estimator
    >    - [SPARK-21765] Set isStreaming on leaf nodes for streaming plans.
    >    - [ML][MINOR] Make sharedParams update.
    >    - [SPARK-19326] Speculated task attempts do not get launched in few
    >    scenarios
    >    - [SPARK-12664][ML] Expose probability in mlp model
    >    - [SPARK-21501] Change CacheLoader to limit entries based on memory
    >    footprint
    >    - [SPARK-21603][SQL][FOLLOW-UP] Change the default value of
    >    maxLinesPerFunction into 4000
    >    - [SPARK-21807][SQL] Override ++ operation in ExpressionSet to reduce
    >    clone time
    >    - [SPARK-21805][SPARKR] Disable R vignettes code on Windows
    >    - [SPARK-21694][MESOS] Support Mesos CNI network labels
    >    - [MINOR][SQL] The comment of Class ExchangeCoordinator exist a typing
    >    and context error
    >    - [SPARK-21804][SQL] json_tuple returns null values within repeated
    >    columns except the first one
    >    - [SPARK-19165][PYTHON][SQL] PySpark APIs using columns as arguments
    >    should validate input types for column
    >    - [SPARK-21745][SQL] Refactor ColumnVector hierarchy to make
    >    ColumnVector read-only and to introduce WritableColumnVector.
    >    - [SPARK-21759][SQL] In.checkInputDataTypes should not wrongly report
    >    unresolved plans for IN correlated subquery
    >    - [SPARK-21826][SQL] outer broadcast hash join should not throw NPE
    >    - [SPARK-21788][SS] Handle more exceptions when stopping a streaming
    >    query
    >    - [SPARK-21701][CORE] Enable RPC client to use `SO_RCVBUF` and
    >    `SO_SNDBUF` in SparkConf.
    >    - [SPARK-21830][SQL] Bump ANTLR version and fix a few issues.
    >    - [SPARK-21108][ML] convert LinearSVC to aggregator framework
    >    - [SPARK-21255][SQL][WIP] Fixed NPE when creating encoder for enum
    >    - [SPARK-21527][CORE] Use buffer limit in order to use JAVA NIO Util's
    >    buffercache
    >    - [MINOR][BUILD] Fix build warnings and Java lint errors
    >    - [SPARK-21832][TEST] Merge SQLBuilderTest into
    >    ExpressionSQLBuilderSuite
    >    - [SPARK-21714][CORE][YARN] Avoiding re-uploading remote resources in
    >    yarn client mode
    >    - [SPARK-17742][CORE] Fail launcher app handle if child process exits
    >    with error.
    >    - [SPARK-21756][SQL] Add JSON option to allow unquoted control
    >    characters
    >    - [SPARK-21837][SQL][TESTS] UserDefinedTypeSuite Local UDTs not
    >    actually testing what it intends
    >    - [SPARK-21831][TEST] Remove `spark.sql.hive.convertMetastoreOrc`
    >    config in HiveCompatibilitySuite
    >    - [MINOR][DOCS] Minor doc fixes related with doc build and uses script
    >    dir in SQL doc gen script
    >    - [SPARK-21843] testNameNote should be "(minNumPostShufflePartitions:
    >    5)"
    >    - [SPARK-21818][ML][MLLIB] Fix bug of MultivariateOnlineSummarizer.variance
    >    generate negative result
    >    - [SPARK-21798] No config to replace deprecated SPARK_CLASSPATH config
    >    for launching daemons like History Server
    >    - [SPARK-19662][SCHEDULER][TEST] Add Fair Scheduler Unit Test coverage
    >    for different build cases
    >    - [SPARK-17139][ML] Add model summary for MultinomialLogisticRegression
    >    - [SPARK-21781][SQL] Modify DataSourceScanExec to use concrete
    >    ColumnVector type.
    >    - [SPARK-21848][SQL] Add trait UserDefinedExpression to identify
    >    user-defined functions
    >    - [SPARK-21255][SQL] simplify encoder for java enum
    >    - [SPARK-21801][SPARKR][TEST] unit test randomly fail with randomforest
    >    - [MINOR][ML] Document treatment of instance weights in logreg summary
    >    - [SPARK-21728][CORE] Allow SparkSubmit to use Logging.
    >    - [SPARK-21813][CORE] Modify TaskMemoryManager.MAXIMUM_PAGE_SIZE_BYTES
    >    comments
    >    - [SPARK-21845][SQL] Make codegen fallback of expressions configurable
    >    - [SPARK-20886][CORE] HadoopMapReduceCommitProtocol to handle
    >    FileOutputCommitter.getWorkPath==null
    >    - [MINOR][TEST] Off-heap memory leaks for unit tests
    >    - [SPARK-21873][SS] - Avoid using `return` inside
    >    `CachedKafkaConsumer.get`
    >    - [SPARK-21806][MLLIB] BinaryClassificationMetrics pr(): first point
    >    (0.0, 1.0) is misleading
    >    - [SPARK-21764][TESTS] Fix tests failures on Windows: resources not
    >    being closed and incorrect paths
    >    - [SPARK-21469][ML][EXAMPLES] Adding Examples for FeatureHasher
    >    - Revert "[SPARK-21845][SQL] Make codegen fallback of expressions
    >    configurable"
    >    - [MINOR][SQL][TEST] Test shuffle hash join while is not expected
    >    - [SPARK-21834] Incorrect executor request in case of dynamic
    >    allocation
    >    - [SPARK-21839][SQL] Support SQL config for ORC compression
    >    - [SPARK-21875][BUILD] Fix Java style bugs
    >    - [SPARK-11574][CORE] Add metrics StatsD sink
    >    - [SPARK-17321][YARN] Avoid writing shuffle metadata to disk if NM
    >    recovery is disabled
    >    - [SPARK-21534][SQL][PYSPARK] PickleException when creating dataframe
    >    from python row with empty bytearray
    >    - [SPARK-21583][SQL] Create a ColumnarBatch from ArrowColumnVectors
    >    - [SPARK-21878][SQL][TEST] Create SQLMetricsTestUtils
    >    - [SPARK-21886][SQL] Use SparkSession.internalCreateDataFrame to
    >    create…
    >    - [SPARK-20812][MESOS] Add secrets support to the dispatcher
    >    - [SPARK-21583][HOTFIX] Removed intercept in test causing failures
    >    - [SPARK-17107][SQL][FOLLOW-UP] Remove redundant pushdown rule for
    >    Union
    >    - [SPARK-21110][SQL] Structs, arrays, and other orderable datatypes
    >    should be usable in inequalities
    >    - [SPARK-17139][ML][FOLLOW-UP] Add convenient method `asBinary` for
    >    casting to BinaryLogisticRegressionSummary
    >    - [SPARK-21862][ML] Add overflow check in PCA
    >    - [SPARK-21779][PYTHON] Simpler DataFrame.sample API in Python
    >    - [SPARK-21789][PYTHON] Remove obsolete codes for parsing abstract
    >    schema strings
    >    - [SPARK-21728][CORE] Follow up: fix user config, auth in SparkSubmit
    >    logging.
    >    - [SPARK-21880][WEB UI] In the SQL table page, modify jobs trace
    >    information
    >    - [SPARK-14280][BUILD][WIP] Update change-version.sh and pom.xml to
    >    add Scala 2.12 profiles and enable 2.12 compilation
    >    - [SPARK-21895][SQL] Support changing database in HiveClient
    >    - [SPARK-21729][ML][TEST] Generic test for ProbabilisticClassifier to
    >    ensure consistent output columns
    >    - [SPARK-21891][SQL] Add TBLPROPERTIES to DDL statement: CREATE TABLE
    >    USING
    >    - [SPARK-21897][PYTHON][R] Add unionByName API to DataFrame in Python
    >    and R
    >    - [SPARK-21654][SQL] Complement SQL predicates expression description
    >    - [SPARK-21418][SQL] NoSuchElementException: None.get in
    >    DataSourceScanExec with sun.io.serialization.extendedDebugInfo=true
    >    - [SPARK-21913][SQL][TEST] `withDatabase` should drop database with
    >    CASCADE
    >    - [SPARK-21903][BUILD] Upgrade scalastyle to 1.0.0.
    >    - [SPARK-20978][SQL] Bump up Univocity version to 2.5.4
    >    - [SPARK-21845][SQL][TEST-MAVEN] Make codegen fallback of expressions
    >    configurable
    >    - [SPARK-21925] Update trigger interval documentation in docs with
    >    behavior change in Spark 2.2
    >    - [SPARK-21652][SQL] Fix rule confliction between
    >    InferFiltersFromConstraints and ConstantPropagation
    >    - [MINOR][DOC] Update `Partition Discovery` section to enumerate all
    >    available file sources
    >    - [SPARK-18061][THRIFTSERVER] Add spnego auth support for ThriftServer
    >    thrift/http protocol
    >    - [SPARK-9104][CORE] Expose Netty memory metrics in Spark
    >    - [SPARK-21924][DOCS] Update structured streaming programming guide doc
    >    - [SPARK-19357][ML] Adding parallel model evaluation in ML tuning
    >    - [SPARK-21903][BUILD][FOLLOWUP] Upgrade scalastyle-maven-plugin and
    >    scalastyle as well in POM and SparkBuild.scala
    >    - [SPARK-21835][SQL] RewritePredicateSubquery should not produce
    >    unresolved query plans
    >    - [SPARK-21801][SPARKR][TEST] set random seed for predictable test
    >    - [SPARK-21765] Check that optimization doesn't affect isStreaming bit.
    >    - [SPARK-21901][SS] Define toString for StateOperatorProgress
    >    - Fixed pandoc dependency issue in python/setup.py
    >    - [SPARK-21835][SQL][FOLLOW-UP] RewritePredicateSubquery should not
    >    produce unresolved query plans
    >    - [SPARK-21912][SQL] ORC/Parquet table should not create invalid
    >    column names
    >    - [SPARK-21890] Credentials not being passed to add the tokens
    >    - [SPARK-13656][SQL] Delete spark.sql.parquet.cacheMetadata from
    >    SQLConf and docs
    >    - [SPARK-21939][TEST] Use TimeLimits instead of Timeouts
    >    - [SPARK-21950][SQL][PYTHON][TEST] pyspark.sql.tests.SQLTests2 should
    >    stop SparkContext.
    >    - [SPARK-21949][TEST] Tables created in unit tests should be dropped
    >    after use
    >    - [SPARK-21726][SQL] Check for structural integrity of the plan in
    >    Optimzer in test mode.
    >    - [SPARK-21936][SQL] backward compatibility test framework for
    >    HiveExternalCatalog
    >    - [SPARK-21726][SQL][FOLLOW-UP] Check for structural integrity of the
    >    plan in Optimzer in test mode
    >    - [SPARK-21946][TEST] fix flaky test: "alter table: rename cached
    >    table" in InMemoryCatalogedDDLSuite
    >    - [SPARK-15243][ML][SQL][PYTHON] Add missing support for unicode in
    >    Param methods & functions in dataframe
    >    - [SPARK-19866][ML][PYSPARK] Add local version of Word2Vec
    >    findSynonyms for spark.ml: Python API
    >    - [SPARK-21941] Stop storing unused attemptId in SQLTaskMetrics
    >    - [SPARK-21954][SQL] JacksonUtils should verify MapType's value type
    >    instead of key type
    >    - [MINOR][SQL] Correct DataFrame doc.
    >    - [SPARK-4131] Support "Writing data into the filesystem from queries"
    >    - [SPARK-20098][PYSPARK] dataType's typeName fix
    >    - [SPARK-21610][SQL] Corrupt records are not handled properly when
    >    creating a dataframe from a file
    >    - [BUILD][TEST][SPARKR] add sparksubmitsuite to appveyor tests
    >    - [SPARK-21856] Add probability and rawPrediction to MLPC for Python
    >    - [MINOR][SQL] remove unuse import class
    >    - [SPARK-21976][DOC] Fix wrong documentation for Mean Absolute Error.
    >    - [SPARK-14516][ML] Adding ClusteringEvaluator with the implementation
    >    of Cosine silhouette and squared Euclidean silhouette.
    >    - [SPARK-21610][SQL][FOLLOWUP] Corrupt records are not handled
    >    properly when creating a dataframe from a file
    >    - [DOCS] Fix unreachable links in the document
    >    - [SPARK-17642][SQL] support DESC EXTENDED/FORMATTED table column
    >    commands
    >    - [SPARK-21027][ML][PYTHON] Added tunable parallelism to one vs. rest
    >    in both Scala mllib and Pyspark
    >    - [SPARK-21368][SQL] TPCDSQueryBenchmark can't refer query files.
    >    - [SPARK-18608][ML] Fix double caching
    >    - [SPARK-21979][SQL] Improve QueryPlanConstraints framework
    >    - [SPARK-21513][SQL] Allow UDF to_json support converting MapType to
    >    json
    >    - [SPARK-21027][MINOR][FOLLOW-UP] add missing since tag
    >    - [BUILD] Close stale PRs
    >    - [SPARK-21982] Set locale to US
    >    - [SPARK-21893][BUILD][STREAMING][WIP] Put Kafka 0.8 behind a profile
    >    - [SPARK-21963][CORE][TEST] Create temp file should be delete after use
    >    - [SPARK-21690][ML] one-pass imputer
    >    - [SPARK-21970][CORE] Fix Redundant Throws Declarations in Java
    >    Codebase
    >    - [SPARK-21980][SQL] References in grouping functions should be
    >    indexed with semanticEquals
    >    - [SPARK-4131] Merge HiveTmpFile.scala to SaveAsHiveFile.scala
    >    - [SPARK-20427][SQL] Read JDBC table use custom schema
    >    - [SPARK-21973][SQL] Add an new option to filter queries in TPC-DS
    >    - [MINOR][SQL] Only populate type metadata for required types such as
    >    CHAR/VARCHAR.
    >    - [SPARK-21854] Added LogisticRegressionTrainingSummary for
    >    MultinomialLogisticRegression in Python API
    >    - [MINOR][DOC] Add missing call of `update()` in examples of
    >    PeriodicGraphCheckpointer & PeriodicRDDCheckpointer
    >    - [SPARK-18608][ML][FOLLOWUP] Fix double caching for PySpark OneVsRest.
    >    - [SPARK-4131][FOLLOW-UP] Support "Writing data into the filesystem
    >    from queries"
    >    - [SPARK-21922] Fix duration always updating when task failed but
    >    status is still RUN…
    >    - [SPARK-17642][SQL][FOLLOWUP] drop test tables and improve comments
    >    - [SPARK-21988] Add default stats to StreamingExecutionRelation.
    >    - [SPARK-21513][SQL][FOLLOWUP] Allow UDF to_json support converting
    >    MapType to json for PySpark and SparkR
    >    - [SPARK-22018][SQL] Preserve top-level alias metadata when collapsing
    >    projects
    >    - [SPARK-21902][CORE] Print root cause for BlockManager#doPut
    >    - [SPARK-22002][SQL] Read JDBC table use custom schema support specify
    >    partial fields.
    >    - [SPARK-21987][SQL] fix a compatibility issue of sql event logs
    >    - [SPARK-21958][ML] Word2VecModel save: transform data in the cluster
    >    - [SPARK-15689][SQL] data source v2 read path
    >    - [SPARK-22017] Take minimum of all watermark execs in StreamExecution.
    >    - [SPARK-21967][CORE] org.apache.spark.unsafe.types.UTF8String#compareTo
    >    Should Compare 8 Bytes at a Time for Better Performance
    >    - [SPARK-22032][PYSPARK] Speed up StructType conversion
    >    - [SPARK-21985][PYSPARK] PairDeserializer is broken for double-zipped
    >    RDDs
    >    - [SPARK-21953] Show both memory and disk bytes spilled if either is
    >    present
    >    - [SPARK-22043][PYTHON] Improves error message for show_profiles and
    >    dump_profiles
    >    - [SPARK-21113][CORE] Read ahead input stream to amortize disk IO cost
    >    …
    >    - [SPARK-22047][TEST] ignore HiveExternalCatalogVersionsSuite
    >    - [SPARK-22003][SQL] support array column in vectorized reader with UDF
    >    - [SPARK-14878][SQL] Trim characters string function support
    >    - [SPARK-22030][CORE] GraphiteSink fails to re-connect to Graphite
    >    instances behind an ELB or any other auto-scaled LB
    >    - [SPARK-22047][FLAKY TEST] HiveExternalCatalogVersionsSuite
    >    - [SPARK-21923][CORE] Avoid calling reserveUnrollMemoryForThisTask for
    >    every record
    >    - [MINOR][CORE] Cleanup dead code and duplication in Mem. Management
    >    - [SPARK-22052] Incorrect Metric assigned in MetricsReporter.scala
    >    - [SPARK-21428][SQL][FOLLOWUP] CliSessionState should point to the
    >    actual metastore not a dummy one
    >    - [SPARK-21917][CORE][YARN] Supporting adding http(s) resources in
    >    yarn mode
    >    - [MINOR][ML] Remove unnecessary default value setting for evaluators.
    >    - [SPARK-21338][SQL] implement isCascadingTruncateTable() method in
    >    AggregatedDialect
    >    - [SPARK-21969][SQL] CommandUtils.updateTableStats should call
    >    refreshTable
    >    - [SPARK-22067][SQL] ArrowWriter should use position when setting
    >    UTF8String ByteBuffer
    >    - [SPARK-18838][CORE] Add separate listener queues to LiveListenerBus.
    >    - [SPARK-21977] SinglePartition optimizations break certain Streaming
    >    Stateful Aggregation requirements
    >    - [SPARK-22066][BUILD] Update checkstyle to 8.2, enable it, fix
    >    violations
    >    - [SPARK-22066][BUILD][HOTFIX] Revert scala-maven-plugin to 3.2.2 to
    >    work with Maven+zinc again
    >    - [SPARK-22049][DOCS] Confusing behavior of from_utc_timestamp and
    >    to_utc_timestamp
    >    - [SPARK-22076][SQL] Expand.projections should not be a Stream
    >    - [SPARK-18838][HOTFIX][YARN] Check internal context state before
    >    stopping it.
    >    - [SPARK-21384][YARN] Spark + YARN fails with LocalFileSystem as
    >    default FS
    >    - [SPARK-22076][SQL][FOLLOWUP] Expand.projections should not be a
    >    Stream
    >    - [SPARK-21934][CORE] Expose Shuffle Netty memory usage to
    >    MetricsSystem
    >    - [SPARK-21780][R] Simpler Dataset.sample API in R
    >    - [SPARK-17997][SQL] Add an aggregation function for counting distinct
    >    values for multiple intervals
    >    - [SPARK-22086][DOCS] Add expression description for CASE WHEN
    >    - [SPARK-21977][HOTFIX] Adjust EnsureStatefulOpPartitioningSuite to
    >    use scalatest lifecycle normally instead of constructor
    >    - [SPARK-21928][CORE] Set classloader on SerializerManager's private
    >    kryo
    >    - [INFRA] Close stale PRs.
    >    - [SPARK-22088][SQL] Incorrect scalastyle comment causes wrong styles
    >    in stringExpressions
    >    - [SPARK-22075][ML] GBTs unpersist datasets cached by Checkpointer
    >    - [SPARK-22009][ML] Using treeAggregate improve some algs
    >    - [SPARK-22053][SS] Stream-stream inner join in Append Mode
    >    - [SPARK-22094][SS] processAllAvailable should check the query state
    >    - [SPARK-21981][PYTHON][ML] Added Python interface for
    >    ClusteringEvaluator
    >    - [SPARK-21998][SQL] SortMergeJoinExec did not calculate its
    >    outputOrdering correctly during physical planning
    >    - [SPARK-22072][SPARK-22071][BUILD] Improve release build scripts
    >    - [SPARK-21190][PYSPARK] Python Vectorized UDFs
    >    - [UI][STREAMING] Modify the title, 'Records' instead of 'Input Size'
    >    - [SPARK-22092] Reallocation in OffHeapColumnVector.reserveInternal
    >    corrupts struct and array data
    >    - [SPARK-21766][PYSPARK][SQL] DataFrame toPandas() raises ValueError
    >    with nullable int columns
    >    - [SPARK-22060][ML] Fix CrossValidator/TrainValidationSplit param
    >    persist/load bug
    >    - [SPARK-18136] Fix SPARK_JARS_DIR for Python pip install on Windows
    >    - [SPARK-22099] The 'job ids' list style needs to be changed in the
    >    SQL page.
    >    - [SPARK-22033][CORE] BufferHolder, other size checks should account
    >    for the specific VM array size limitations
    >    - [SPARK-22109][SQL] Resolves type conflicts between strings and
    >    timestamps in partition column
    >    - [SPARK-20448][DOCS] Document how FileInputDStream works with object
    >    storage
    >    - [SPARK-22110][SQL][DOCUMENTATION] Add usage and improve
    >    documentation with arguments and examples for trim function
    >    - [SPARK-21338][SQL][FOLLOW-UP] Implement isCascadingTruncateTable()
    >    method in AggregatedDialect
    >    - [SPARK-22093][TESTS] Fixes `assume` in `UtilsSuite` and
    >    `HiveDDLSuite`
    >    - [SPARK-22058][CORE] the BufferedInputStream will not be closed if an
    >    exception occurs.
    >    - [SPARK-22087][SPARK-14650][WIP][BUILD][REPL][CORE] Compile Spark
    >    REPL for Scala 2.12 + other 2.12 fixes
    >    - [SPARK-22107] Change as to alias in python quickstart
    >
    > File Changes
    >
    >    - *A* .github/PULL_REQUEST_TEMPLATE
    >    <https://github.com/apache/spark/pull/19335/files#diff-0> (10)
    >    - *M* .gitignore
    >    <https://github.com/apache/spark/pull/19335/files#diff-1> (113)
    >    - *D* .rat-excludes
    >    <https://github.com/apache/spark/pull/19335/files#diff-2> (85)
    >    - *A* .travis.yml
    >    <https://github.com/apache/spark/pull/19335/files#diff-3> (50)
    >    - *M* CONTRIBUTING.md
    >    <https://github.com/apache/spark/pull/19335/files#diff-4> (4)
    >    - *M* LICENSE <https://github.com/apache/spark/pull/19335/files#diff-5>
    >    (38)
    >    - *M* NOTICE <https://github.com/apache/spark/pull/19335/files#diff-6>
    >    (78)
    >    - *M* R/.gitignore
    >    <https://github.com/apache/spark/pull/19335/files#diff-7> (2)
    >    - *A* R/CRAN_RELEASE.md
    >    <https://github.com/apache/spark/pull/19335/files#diff-8> (91)
    >    - *M* R/DOCUMENTATION.md
    >    <https://github.com/apache/spark/pull/19335/files#diff-9> (12)
    >    - *M* R/README.md
    >    <https://github.com/apache/spark/pull/19335/files#diff-10> (52)
    >    - *M* R/WINDOWS.md
    >    <https://github.com/apache/spark/pull/19335/files#diff-11> (33)
    >    - *A* R/check-cran.sh
    >    <https://github.com/apache/spark/pull/19335/files#diff-12> (76)
    >    - *M* R/create-docs.sh
    >    <https://github.com/apache/spark/pull/19335/files#diff-13> (24)
    >    - *A* R/create-rd.sh
    >    <https://github.com/apache/spark/pull/19335/files#diff-14> (37)
    >    - *A* R/find-r.sh
    >    <https://github.com/apache/spark/pull/19335/files#diff-15> (34)
    >    - *M* R/install-dev.bat
    >    <https://github.com/apache/spark/pull/19335/files#diff-16> (6)
    >    - *M* R/install-dev.sh
    >    <https://github.com/apache/spark/pull/19335/files#diff-17> (16)
    >    - *A* R/install-source-package.sh
    >    <https://github.com/apache/spark/pull/19335/files#diff-18> (57)
    >    - *A* R/pkg/.Rbuildignore
    >    <https://github.com/apache/spark/pull/19335/files#diff-19> (9)
    >    - *M* R/pkg/.lintr
    >    <https://github.com/apache/spark/pull/19335/files#diff-20> (2)
    >    - *M* R/pkg/DESCRIPTION
    >    <https://github.com/apache/spark/pull/19335/files#diff-21> (52)
    >    - *M* R/pkg/NAMESPACE
    >    <https://github.com/apache/spark/pull/19335/files#diff-22> (240)
    >    - *M* R/pkg/R/DataFrame.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-23> (3301)
    >    - *M* R/pkg/R/RDD.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-24> (1742)
    >    - *M* R/pkg/R/SQLContext.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-25> (830)
    >    - *A* R/pkg/R/WindowSpec.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-26> (224)
    >    - *M* R/pkg/R/backend.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-27> (25)
    >    - *M* R/pkg/R/broadcast.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-28> (9)
    >    - *A* R/pkg/R/catalog.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-29> (526)
    >    - *M* R/pkg/R/client.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-30> (16)
    >    - *M* R/pkg/R/column.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-31> (178)
    >    - *M* R/pkg/R/context.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-32> (470)
    >    - *M* R/pkg/R/deserialize.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-33> (37)
    >    - *M* R/pkg/R/functions.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-34> (3195)
    >    - *M* R/pkg/R/generics.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-35> (1085)
    >    - *M* R/pkg/R/group.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-36> (152)
    >    - *A* R/pkg/R/install.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-37> (312)
    >    - *M* R/pkg/R/jobj.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-38> (9)
    >    - *A* R/pkg/R/jvm.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-39> (117)
    >    - *D* R/pkg/R/mllib.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-40> (101)
    >    - *A* R/pkg/R/mllib_classification.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-41> (635)
    >    - *A* R/pkg/R/mllib_clustering.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-42> (634)
    >    - *A* R/pkg/R/mllib_fpm.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-43> (162)
    >    - *A* R/pkg/R/mllib_recommendation.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-44> (162)
    >    - *A* R/pkg/R/mllib_regression.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-45> (552)
    >    - *A* R/pkg/R/mllib_stat.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-46> (127)
    >    - *A* R/pkg/R/mllib_tree.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-47> (765)
    >    - *A* R/pkg/R/mllib_utils.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-48> (132)
    >    - *M* R/pkg/R/pairRDD.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-49> (970)
    >    - *M* R/pkg/R/schema.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-50> (103)
    >    - *M* R/pkg/R/serialize.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-51> (6)
    >    - *M* R/pkg/R/sparkR.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-52> (462)
    >    - *M* R/pkg/R/stats.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-53> (173)
    >    - *A* R/pkg/R/streaming.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-54> (214)
    >    - *A* R/pkg/R/types.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-55> (85)
    >    - *M* R/pkg/R/utils.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-56> (326)
    >    - *A* R/pkg/R/window.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-57> (116)
    >    - *M* R/pkg/inst/profile/general.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-58> (5)
    >    - *M* R/pkg/inst/profile/shell.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-59> (14)
    >    - *D* R/pkg/inst/test_support/sparktestjar_2.10-1.0.jar
    >    <https://github.com/apache/spark/pull/19335/files#diff-60> (0)
    >    - *D* R/pkg/inst/tests/jarTest.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-61> (32)
    >    - *D* R/pkg/inst/tests/packageInAJarTest.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-62> (30)
    >    - *D* R/pkg/inst/tests/test_Serde.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-63> (77)
    >    - *D* R/pkg/inst/tests/test_binaryFile.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-64> (89)
    >    - *D* R/pkg/inst/tests/test_binary_function.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-65> (101)
    >    - *D* R/pkg/inst/tests/test_broadcast.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-66> (48)
    >    - *D* R/pkg/inst/tests/test_client.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-67> (36)
    >    - *D* R/pkg/inst/tests/test_context.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-68> (94)
    >    - *D* R/pkg/inst/tests/test_includeJAR.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-69> (37)
    >    - *D* R/pkg/inst/tests/test_includePackage.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-70> (57)
    >    - *D* R/pkg/inst/tests/test_mllib.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-71> (86)
    >    - *D* R/pkg/inst/tests/test_parallelize_collect.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-72> (109)
    >    - *D* R/pkg/inst/tests/test_rdd.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-73> (793)
    >    - *D* R/pkg/inst/tests/test_shuffle.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-74> (221)
    >    - *D* R/pkg/inst/tests/test_sparkSQL.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-75> (1499)
    >    - *D* R/pkg/inst/tests/test_take.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-76> (66)
    >    - *D* R/pkg/inst/tests/test_textFile.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-77> (161)
    >    - *D* R/pkg/inst/tests/test_utils.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-78> (140)
    >    - *A* R/pkg/inst/tests/testthat/test_basic.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-79> (90)
    >    - *M* R/pkg/inst/worker/daemon.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-80> (62)
    >    - *M* R/pkg/inst/worker/worker.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-81> (135)
    >    - *A* R/pkg/tests/fulltests/jarTest.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-82> (32)
    >    - *A* R/pkg/tests/fulltests/packageInAJarTest.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-83> (30)
    >    - *A* R/pkg/tests/fulltests/test_Serde.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-84> (79)
    >    - *A* R/pkg/tests/fulltests/test_Windows.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-85> (27)
    >    - *A* R/pkg/tests/fulltests/test_binaryFile.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-86> (92)
    >    - *A* R/pkg/tests/fulltests/test_binary_function.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-87> (104)
    >    - *A* R/pkg/tests/fulltests/test_broadcast.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-88> (51)
    >    - *A* R/pkg/tests/fulltests/test_client.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-89> (43)
    >    - *A* R/pkg/tests/fulltests/test_context.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-90> (0)
    >    - *M* R/pkg/tests/fulltests/test_includePackage.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-91> (0)
    >    - *M* R/pkg/tests/fulltests/test_jvm_api.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-92> (0)
    >    - *M* R/pkg/tests/fulltests/test_mllib_classification.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-93> (0)
    >    - *M* R/pkg/tests/fulltests/test_mllib_clustering.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-94> (0)
    >    - *M* R/pkg/tests/fulltests/test_mllib_fpm.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-95> (0)
    >    - *M* R/pkg/tests/fulltests/test_mllib_recommendation.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-96> (0)
    >    - *M* R/pkg/tests/fulltests/test_mllib_regression.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-97> (0)
    >    - *M* R/pkg/tests/fulltests/test_mllib_stat.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-98> (0)
    >    - *M* R/pkg/tests/fulltests/test_mllib_tree.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-99> (0)
    >    - *M* R/pkg/tests/fulltests/test_parallelize_collect.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-100> (0)
    >    - *M* R/pkg/tests/fulltests/test_rdd.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-101> (0)
    >    - *M* R/pkg/tests/fulltests/test_shuffle.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-102> (0)
    >    - *M* R/pkg/tests/fulltests/test_sparkR.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-103> (0)
    >    - *M* R/pkg/tests/fulltests/test_sparkSQL.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-104> (0)
    >    - *M* R/pkg/tests/fulltests/test_streaming.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-105> (0)
    >    - *M* R/pkg/tests/fulltests/test_take.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-106> (0)
    >    - *M* R/pkg/tests/fulltests/test_textFile.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-107> (0)
    >    - *M* R/pkg/tests/fulltests/test_utils.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-108> (0)
    >    - *M* R/pkg/tests/run-all.R
    >    <https://github.com/apache/spark/pull/19335/files#diff-109> (0)
    >    - *M* R/pkg/vignettes/sparkr-vignettes.Rmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-110> (0)
    >    - *M* R/run-tests.sh
    >    <https://github.com/apache/spark/pull/19335/files#diff-111> (0)
    >    - *M* README.md
    >    <https://github.com/apache/spark/pull/19335/files#diff-112> (0)
    >    - *M* appveyor.yml
    >    <https://github.com/apache/spark/pull/19335/files#diff-113> (0)
    >    - *M* assembly/README
    >    <https://github.com/apache/spark/pull/19335/files#diff-114> (0)
    >    - *M* assembly/pom.xml
    >    <https://github.com/apache/spark/pull/19335/files#diff-115> (0)
    >    - *M* assembly/src/main/assembly/assembly.xml
    >    <https://github.com/apache/spark/pull/19335/files#diff-116> (0)
    >    - *M* bagel/pom.xml
    >    <https://github.com/apache/spark/pull/19335/files#diff-117> (0)
    >    - *M* bagel/src/main/scala/org/apache/spark/bagel/Bagel.scala
    >    <https://github.com/apache/spark/pull/19335/files#diff-118> (0)
    >    - *M* bagel/src/main/scala/org/apache/spark/bagel/package-info.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-119> (0)
    >    - *M* bagel/src/main/scala/org/apache/spark/bagel/package.scala
    >    <https://github.com/apache/spark/pull/19335/files#diff-120> (0)
    >    - *M* bin/beeline
    >    <https://github.com/apache/spark/pull/19335/files#diff-121> (0)
    >    - *M* bin/beeline.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-122> (0)
    >    - *M* bin/find-spark-home
    >    <https://github.com/apache/spark/pull/19335/files#diff-123> (0)
    >    - *M* bin/load-spark-env.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-124> (0)
    >    - *M* bin/load-spark-env.sh
    >    <https://github.com/apache/spark/pull/19335/files#diff-125> (0)
    >    - *M* bin/pyspark
    >    <https://github.com/apache/spark/pull/19335/files#diff-126> (0)
    >    - *M* bin/pyspark.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-127> (0)
    >    - *M* bin/pyspark2.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-128> (0)
    >    - *M* bin/run-example
    >    <https://github.com/apache/spark/pull/19335/files#diff-129> (0)
    >    - *M* bin/run-example.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-130> (0)
    >    - *M* bin/run-example2.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-131> (0)
    >    - *M* bin/spark-class
    >    <https://github.com/apache/spark/pull/19335/files#diff-132> (0)
    >    - *M* bin/spark-class.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-133> (0)
    >    - *M* bin/spark-class2.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-134> (0)
    >    - *M* bin/spark-shell
    >    <https://github.com/apache/spark/pull/19335/files#diff-135> (0)
    >    - *M* bin/spark-shell.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-136> (0)
    >    - *M* bin/spark-shell2.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-137> (0)
    >    - *M* bin/spark-sql
    >    <https://github.com/apache/spark/pull/19335/files#diff-138> (0)
    >    - *M* bin/spark-submit
    >    <https://github.com/apache/spark/pull/19335/files#diff-139> (0)
    >    - *M* bin/spark-submit.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-140> (0)
    >    - *M* bin/spark-submit2.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-141> (0)
    >    - *M* bin/sparkR
    >    <https://github.com/apache/spark/pull/19335/files#diff-142> (0)
    >    - *M* bin/sparkR.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-143> (0)
    >    - *M* bin/sparkR2.cmd
    >    <https://github.com/apache/spark/pull/19335/files#diff-144> (0)
    >    - *M* build/mvn
    >    <https://github.com/apache/spark/pull/19335/files#diff-145> (0)
    >    - *M* build/sbt-launch-lib.bash
    >    <https://github.com/apache/spark/pull/19335/files#diff-146> (0)
    >    - *M* build/spark-build-info
    >    <https://github.com/apache/spark/pull/19335/files#diff-147> (0)
    >    - *M* common/kvstore/pom.xml
    >    <https://github.com/apache/spark/pull/19335/files#diff-148> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    ArrayWrappers.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-149> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    InMemoryStore.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-150> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    KVIndex.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-151> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    KVStore.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-152> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    KVStoreIterator.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-153> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    KVStoreSerializer.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-154> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    KVStoreView.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-155> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    KVTypeInfo.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-156> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    LevelDB.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-157> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    LevelDBIterator.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-158> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    LevelDBTypeInfo.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-159> (0)
    >    - *M* common/kvstore/src/main/java/org/apache/spark/util/kvstore/
    >    UnsupportedStoreVersionException.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-160> (0)
    >    - *M* common/kvstore/src/test/java/org/apache/spark/util/kvstore/
    >    ArrayKeyIndexType.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-161> (0)
    >    - *M* common/kvstore/src/test/java/org/apache/spark/util/kvstore/
    >    ArrayWrappersSuite.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-162> (0)
    >    - *M* common/kvstore/src/test/java/org/apache/spark/util/kvstore/
    >    CustomType1.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-163> (0)
    >    - *M* common/kvstore/src/test/java/org/apache/spark/util/kvstore/
    >    DBIteratorSuite.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-164> (0)
    >    - *M* common/kvstore/src/test/java/org/apache/spark/util/kvstore/
    >    InMemoryIteratorSuite.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-165> (0)
    >    - *M* common/kvstore/src/test/java/org/apache/spark/util/kvstore/
    >    InMemoryStoreSuite.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-166> (0)
    >    - *M* common/kvstore/src/test/java/org/apache/spark/util/kvstore/
    >    LevelDBBenchmark.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-167> (0)
    >    - *M* common/kvstore/src/test/java/org/apache/spark/util/kvstore/
    >    LevelDBIteratorSuite.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-168> (0)
    >    - *M* common/kvstore/src/test/java/org/apache/spark/util/kvstore/
    >    LevelDBSuite.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-169> (0)
    >    - *M* common/kvstore/src/test/java/org/apache/spark/util/kvstore/
    >    LevelDBTypeInfoSuite.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-170> (0)
    >    - *R* common/kvstore/src/test/resources/log4j.properties
    >    <https://github.com/apache/spark/pull/19335/files#diff-171> (0)
    >    - *M* common/network-common/pom.xml
    >    <https://github.com/apache/spark/pull/19335/files#diff-172> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/TransportContext.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-173> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/buffer/FileSegmentManagedBuffer.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-174> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/buffer/ManagedBuffer.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-175> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/buffer/NettyManagedBuffer.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-176> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/buffer/NioManagedBuffer.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-177> (0)
    >    - *R* common/network-common/src/main/java/org/apache/spark/
    >    network/client/ChunkFetchFailureException.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-178> (0)
    >    - *R* common/network-common/src/main/java/org/apache/spark/
    >    network/client/ChunkReceivedCallback.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-179> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/client/RpcResponseCallback.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-180> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/client/StreamCallback.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-181> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/client/StreamInterceptor.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-182> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/client/TransportClient.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-183> (0)
    >    - *R* common/network-common/src/main/java/org/apache/spark/
    >    network/client/TransportClientBootstrap.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-184> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/client/TransportClientFactory.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-185> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/client/TransportResponseHandler.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-186> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/crypto/AuthClientBootstrap.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-187> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/crypto/AuthEngine.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-188> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/crypto/AuthRpcHandler.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-189> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/crypto/AuthServerBootstrap.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-190> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/crypto/ClientChallenge.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-191> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/crypto/README.md
    >    <https://github.com/apache/spark/pull/19335/files#diff-192> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/crypto/ServerResponse.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-193> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/crypto/TransportCipher.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-194> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/protocol/AbstractMessage.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-195> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/protocol/AbstractResponseMessage.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-196> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/protocol/ChunkFetchFailure.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-197> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/
    >    network/protocol/ChunkFetchRequest.java
    >    <https://github.com/apache/spark/pull/19335/files#diff-198> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/ChunkFetchSuccess.java <https://github.com/apache/spark/pull/19335/files#diff-199> (0)
    >    - *R* common/network-common/src/main/java/org/apache/spark/network/protocol/Encodable.java <https://github.com/apache/spark/pull/19335/files#diff-200> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/Encoders.java <https://github.com/apache/spark/pull/19335/files#diff-201> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/Message.java <https://github.com/apache/spark/pull/19335/files#diff-202> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/MessageDecoder.java <https://github.com/apache/spark/pull/19335/files#diff-203> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/MessageEncoder.java <https://github.com/apache/spark/pull/19335/files#diff-204> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/MessageWithHeader.java <https://github.com/apache/spark/pull/19335/files#diff-205> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/OneWayMessage.java <https://github.com/apache/spark/pull/19335/files#diff-206> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/RequestMessage.java <https://github.com/apache/spark/pull/19335/files#diff-207> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/ResponseMessage.java <https://github.com/apache/spark/pull/19335/files#diff-208> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/RpcFailure.java <https://github.com/apache/spark/pull/19335/files#diff-209> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/RpcRequest.java <https://github.com/apache/spark/pull/19335/files#diff-210> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/RpcResponse.java <https://github.com/apache/spark/pull/19335/files#diff-211> (0)
    >    - *R* common/network-common/src/main/java/org/apache/spark/network/protocol/StreamChunkId.java <https://github.com/apache/spark/pull/19335/files#diff-212> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/StreamFailure.java <https://github.com/apache/spark/pull/19335/files#diff-213> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/StreamRequest.java <https://github.com/apache/spark/pull/19335/files#diff-214> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/protocol/StreamResponse.java <https://github.com/apache/spark/pull/19335/files#diff-215> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/sasl/SaslClientBootstrap.java <https://github.com/apache/spark/pull/19335/files#diff-216> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/sasl/SaslEncryption.java <https://github.com/apache/spark/pull/19335/files#diff-217> (0)
    >    - *R* common/network-common/src/main/java/org/apache/spark/network/sasl/SaslEncryptionBackend.java <https://github.com/apache/spark/pull/19335/files#diff-218> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/sasl/SaslMessage.java <https://github.com/apache/spark/pull/19335/files#diff-219> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/sasl/SaslRpcHandler.java <https://github.com/apache/spark/pull/19335/files#diff-220> (0)
    >    - *R* common/network-common/src/main/java/org/apache/spark/network/sasl/SaslServerBootstrap.java <https://github.com/apache/spark/pull/19335/files#diff-221> (0)
    >    - *R* common/network-common/src/main/java/org/apache/spark/network/sasl/SecretKeyHolder.java <https://github.com/apache/spark/pull/19335/files#diff-222> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/sasl/SparkSaslClient.java <https://github.com/apache/spark/pull/19335/files#diff-223> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/sasl/SparkSaslServer.java <https://github.com/apache/spark/pull/19335/files#diff-224> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/server/MessageHandler.java <https://github.com/apache/spark/pull/19335/files#diff-225> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/server/NoOpRpcHandler.java <https://github.com/apache/spark/pull/19335/files#diff-226> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/server/OneForOneStreamManager.java <https://github.com/apache/spark/pull/19335/files#diff-227> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/server/RpcHandler.java <https://github.com/apache/spark/pull/19335/files#diff-228> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/server/StreamManager.java <https://github.com/apache/spark/pull/19335/files#diff-229> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/server/TransportChannelHandler.java <https://github.com/apache/spark/pull/19335/files#diff-230> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/server/TransportRequestHandler.java <https://github.com/apache/spark/pull/19335/files#diff-231> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/server/TransportServer.java <https://github.com/apache/spark/pull/19335/files#diff-232> (0)
    >    - *R* common/network-common/src/main/java/org/apache/spark/network/server/TransportServerBootstrap.java <https://github.com/apache/spark/pull/19335/files#diff-233> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/ByteArrayReadableChannel.java <https://github.com/apache/spark/pull/19335/files#diff-234> (0)
    >    - *R* common/network-common/src/main/java/org/apache/spark/network/util/ByteArrayWritableChannel.java <https://github.com/apache/spark/pull/19335/files#diff-235> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/ByteUnit.java <https://github.com/apache/spark/pull/19335/files#diff-236> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/ConfigProvider.java <https://github.com/apache/spark/pull/19335/files#diff-237> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/CryptoUtils.java <https://github.com/apache/spark/pull/19335/files#diff-238> (0)
    >    - *R* common/network-common/src/main/java/org/apache/spark/network/util/IOMode.java <https://github.com/apache/spark/pull/19335/files#diff-239> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/JavaUtils.java <https://github.com/apache/spark/pull/19335/files#diff-240> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/LevelDBProvider.java <https://github.com/apache/spark/pull/19335/files#diff-241> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/LimitedInputStream.java <https://github.com/apache/spark/pull/19335/files#diff-242> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/MapConfigProvider.java <https://github.com/apache/spark/pull/19335/files#diff-243> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/NettyMemoryMetrics.java <https://github.com/apache/spark/pull/19335/files#diff-244> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/NettyUtils.java <https://github.com/apache/spark/pull/19335/files#diff-245> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/TransportConf.java <https://github.com/apache/spark/pull/19335/files#diff-246> (0)
    >    - *M* common/network-common/src/main/java/org/apache/spark/network/util/TransportFrameDecoder.java <https://github.com/apache/spark/pull/19335/files#diff-247> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/ChunkFetchIntegrationSuite.java <https://github.com/apache/spark/pull/19335/files#diff-248> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/ProtocolSuite.java <https://github.com/apache/spark/pull/19335/files#diff-249> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/RequestTimeoutIntegrationSuite.java <https://github.com/apache/spark/pull/19335/files#diff-250> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/RpcIntegrationSuite.java <https://github.com/apache/spark/pull/19335/files#diff-251> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/StreamSuite.java <https://github.com/apache/spark/pull/19335/files#diff-252> (0)
    >    - *R* common/network-common/src/test/java/org/apache/spark/network/TestManagedBuffer.java <https://github.com/apache/spark/pull/19335/files#diff-253> (0)
    >    - *R* common/network-common/src/test/java/org/apache/spark/network/TestUtils.java <https://github.com/apache/spark/pull/19335/files#diff-254> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/TransportClientFactorySuite.java <https://github.com/apache/spark/pull/19335/files#diff-255> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/TransportRequestHandlerSuite.java <https://github.com/apache/spark/pull/19335/files#diff-256> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/TransportResponseHandlerSuite.java <https://github.com/apache/spark/pull/19335/files#diff-257> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/crypto/AuthEngineSuite.java <https://github.com/apache/spark/pull/19335/files#diff-258> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/crypto/AuthIntegrationSuite.java <https://github.com/apache/spark/pull/19335/files#diff-259> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/crypto/AuthMessagesSuite.java <https://github.com/apache/spark/pull/19335/files#diff-260> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/protocol/MessageWithHeaderSuite.java <https://github.com/apache/spark/pull/19335/files#diff-261> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/sasl/SparkSaslSuite.java <https://github.com/apache/spark/pull/19335/files#diff-262> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/server/OneForOneStreamManagerSuite.java <https://github.com/apache/spark/pull/19335/files#diff-263> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/util/CryptoUtilsSuite.java <https://github.com/apache/spark/pull/19335/files#diff-264> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/util/NettyMemoryMetricsSuite.java <https://github.com/apache/spark/pull/19335/files#diff-265> (0)
    >    - *M* common/network-common/src/test/java/org/apache/spark/network/util/TransportFrameDecoderSuite.java <https://github.com/apache/spark/pull/19335/files#diff-266> (0)
    >    - *M* common/network-common/src/test/resources/log4j.properties <https://github.com/apache/spark/pull/19335/files#diff-267> (0)
    >    - *M* common/network-shuffle/pom.xml <https://github.com/apache/spark/pull/19335/files#diff-268> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/sasl/ShuffleSecretManager.java <https://github.com/apache/spark/pull/19335/files#diff-269> (0)
    >    - *R* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/BlockFetchingListener.java <https://github.com/apache/spark/pull/19335/files#diff-270> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalShuffleBlockHandler.java <https://github.com/apache/spark/pull/19335/files#diff-271> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalShuffleBlockResolver.java <https://github.com/apache/spark/pull/19335/files#diff-272> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalShuffleClient.java <https://github.com/apache/spark/pull/19335/files#diff-273> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/OneForOneBlockFetcher.java <https://github.com/apache/spark/pull/19335/files#diff-274> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RetryingBlockFetcher.java <https://github.com/apache/spark/pull/19335/files#diff-275> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ShuffleClient.java <https://github.com/apache/spark/pull/19335/files#diff-276> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ShuffleIndexInformation.java <https://github.com/apache/spark/pull/19335/files#diff-277> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ShuffleIndexRecord.java <https://github.com/apache/spark/pull/19335/files#diff-278> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/TempShuffleFileManager.java <https://github.com/apache/spark/pull/19335/files#diff-279> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/mesos/MesosExternalShuffleClient.java <https://github.com/apache/spark/pull/19335/files#diff-280> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/BlockTransferMessage.java <https://github.com/apache/spark/pull/19335/files#diff-281> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/ExecutorShuffleInfo.java <https://github.com/apache/spark/pull/19335/files#diff-282> (0)
    >    - *R* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/OpenBlocks.java <https://github.com/apache/spark/pull/19335/files#diff-283> (0)
    >    - *R* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/RegisterExecutor.java <https://github.com/apache/spark/pull/19335/files#diff-284> (0)
    >    - *R* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/StreamHandle.java <https://github.com/apache/spark/pull/19335/files#diff-285> (0)
    >    - *R* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/UploadBlock.java <https://github.com/apache/spark/pull/19335/files#diff-286> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/mesos/RegisterDriver.java <https://github.com/apache/spark/pull/19335/files#diff-287> (0)
    >    - *M* common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/mesos/ShuffleServiceHeartbeat.java <https://github.com/apache/spark/pull/19335/files#diff-288> (0)
    >    - *M* common/network-shuffle/src/test/java/org/apache/spark/network/sasl/SaslIntegrationSuite.java <https://github.com/apache/spark/pull/19335/files#diff-28

