Repository: spark
Updated Branches:
refs/heads/branch-1.5 d213aa77c -> ae18342a5
[SPARK-9918] [MLLIB] remove runs from k-means and rename epsilon to tol
This requires some discussion. I'm not sure whether `runs` is a useful
parameter. It certainly complicates the implementation. We might want
Repository: spark
Updated Branches:
refs/heads/master d0b18919d -> 68f995714
[SPARK-9918] [MLLIB] remove runs from k-means and rename epsilon to tol
This requires some discussion. I'm not sure whether `runs` is a useful
parameter. It certainly complicates the implementation. We might want to
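The snippet above is truncated, but the `tol` (formerly `epsilon`) stopping rule it refers to is easy to illustrate. Below is a minimal single-machine sketch, assuming a plain Lloyd's iteration on 1-D points; MLlib's real implementation is distributed and uses k-means|| initialization, not the naive first-k initialization used here.

```python
def kmeans(points, k, tol=1e-4, max_iter=100):
    """Toy Lloyd's algorithm on 1-D points, stopping once no center
    moves more than `tol`. Illustrative sketch only."""
    centers = list(points[:k])  # naive init; MLlib uses k-means||
    for _ in range(max_iter):
        # assign each point to its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # recompute centers and track the largest movement
        moved = 0.0
        for i, members in enumerate(clusters):
            if members:
                new_center = sum(members) / len(members)
                moved = max(moved, abs(new_center - centers[i]))
                centers[i] = new_center
        if moved < tol:  # the `tol` convergence criterion
            break
    return sorted(centers)
```

With `runs` removed, a single pass governed by `tol` decides convergence, rather than repeating the whole procedure several times and keeping the best result.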
Repository: spark
Updated Branches:
refs/heads/branch-1.5 694e7a3c4 -> d213aa77c
[SPARK-9914] [ML] define setters explicitly for Java and use setParam group in
RFormula
The problem with defining setters in the base class is that it doesn't return
the correct type in Java.
ericl
Author: Xia
Repository: spark
Updated Branches:
refs/heads/branch-1.5 690284037 -> 694e7a3c4
[SPARK-9927] [SQL] Revert 8049 since it's pushing wrong filter down
I made a mistake in #8049 by casting the literal value to the attribute's data
type, which would simply truncate the literal value and push a wron
Repository: spark
Updated Branches:
refs/heads/master d7eb371eb -> d0b18919d
[SPARK-9927] [SQL] Revert 8049 since it's pushing wrong filter down
I made a mistake in #8049 by casting the literal value to the attribute's data
type, which would simply truncate the literal value and push a wrong fi
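The hazard described above is not Catalyst-specific and can be shown in plain Python: casting a fractional literal down to a column's integer type truncates it, so the pushed-down predicate is no longer the one the user wrote.

```python
# Toy illustration (not Catalyst code): a filter `col = 3.5` over an
# integer column should match nothing, but casting the literal to the
# column's type turns it into `col = 3`, which wrongly matches a row.
rows = [1, 2, 3, 4]

correct   = [r for r in rows if r == 3.5]       # literal kept as-is
truncated = [r for r in rows if r == int(3.5)]  # int(3.5) == 3
```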
Repository: spark
Updated Branches:
refs/heads/master df5438921 -> d7eb371eb
[SPARK-9914] [ML] define setters explicitly for Java and use setParam group in
RFormula
The problem with defining setters in the base class is that it doesn't return
the correct type in Java.
ericl
Author: Xiangru
Repository: spark
Updated Branches:
refs/heads/master 5fc058a1f -> df5438921
[SPARK-8922] [DOCUMENTATION, MLLIB] Add @since tags to mllib.evaluation
Author: shikai.tang
Closes #7429 from mosessky/master.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.a
Repository: spark
Updated Branches:
refs/heads/branch-1.5 8f055e595 -> 690284037
[SPARK-8922] [DOCUMENTATION, MLLIB] Add @since tags to mllib.evaluation
Author: shikai.tang
Closes #7429 from mosessky/master.
(cherry picked from commit df543892122342b97e5137b266959ba97589b3ef)
Signed-off-by:
Preparing development version 1.5.0-SNAPSHOT
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/8f055e59
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/8f055e59
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/8f0
Repository: spark
Updated Tags: refs/tags/v1.5.0-preview-20150812 [created] cedce9bdb
-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
Repository: spark
Updated Branches:
refs/heads/branch-1.5 16f4bf4ca -> 8f055e595
Preparing Spark release v1.5.0-preview-20150812
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/cedce9bd
Tree: http://git-wip-us.apache.
Repository: spark
Updated Branches:
refs/heads/branch-1.5 8229437c3 -> 16f4bf4ca
[SPARK-9917] [ML] add getMin/getMax and doc for originalMin/originalMax in
MinMaxScaler
hhbyyh
Author: Xiangrui Meng
Closes #8145 from mengxr/SPARK-9917.
(cherry picked from commit 5fc058a1fc5d83ad53feec93647
Repository: spark
Updated Branches:
refs/heads/master a8ab2634c -> 5fc058a1f
[SPARK-9917] [ML] add getMin/getMax and doc for originalMin/originalMax in
MinMaxScaler
hhbyyh
Author: Xiangrui Meng
Closes #8145 from mengxr/SPARK-9917.
Project: http://git-wip-us.apache.org/repos/asf/spark/rep
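For context on what `originalMin`/`originalMax` mean in MinMaxScaler, here is a minimal sketch of the underlying rescaling; the constant-column convention and function name are assumptions, not MLlib's code.

```python
def min_max_scale(xs, lo=0.0, hi=1.0):
    """Rescale xs linearly to [lo, hi]. original_min and original_max
    play the role of MinMaxScaler's originalMin/originalMax params.
    Illustrative sketch only."""
    original_min, original_max = min(xs), max(xs)
    span = original_max - original_min
    if span == 0.0:
        # constant column: map everything to the midpoint
        # (a common convention for degenerate features)
        return [(lo + hi) / 2.0 for _ in xs]
    return [lo + (x - original_min) / span * (hi - lo) for x in xs]
```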
Repository: spark
Updated Branches:
refs/heads/branch-1.5 3b1b8ea3e -> 8229437c3
[SPARK-9832] [SQL] add a thread-safe lookup for BytesToBytesMap
This patch adds a thread-safe lookup for BytesToBytesMap, and uses that in
broadcasted HashedRelation.
Author: Davies Liu
Closes #8151 from davies/
Repository: spark
Updated Branches:
refs/heads/master 227821905 -> a8ab2634c
[SPARK-9832] [SQL] add a thread-safe lookup for BytesToBytesMap
This patch adds a thread-safe lookup for BytesToBytesMap, and uses that in
broadcasted HashedRelation.
Author: Davies Liu
Closes #8151 from davies/safe
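The idea of a thread-safe lookup on a shared map can be sketched in a few lines. This is a toy stand-in for the role BytesToBytesMap plays in a broadcast HashedRelation (many tasks probing one shared structure); the class and method names are illustrative, not Spark's.

```python
import threading

class SafeLookup:
    """Toy shared hash map whose lookups are serialized by a lock,
    so concurrent readers never observe the map mid-update.
    Illustrative only; not Spark's BytesToBytesMap."""
    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def put(self, key, value):
        with self._lock:
            self._data[key] = value

    def safe_lookup(self, key):
        # the lock makes this safe to call from many task threads
        with self._lock:
            return self._data.get(key)
```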
Repository: spark
Updated Branches:
refs/heads/branch-1.5 3d1b9f007 -> 3b1b8ea3e
[SPARK-9920] [SQL] The simpleString of TungstenAggregate does not show its
output
https://issues.apache.org/jira/browse/SPARK-9920
Taking `sqlContext.sql("select i, sum(j1) as sum from testAgg group by
i").expl
Repository: spark
Updated Branches:
refs/heads/master 2fb4901b7 -> 227821905
[SPARK-9920] [SQL] The simpleString of TungstenAggregate does not show its
output
https://issues.apache.org/jira/browse/SPARK-9920
Taking `sqlContext.sql("select i, sum(j1) as sum from testAgg group by
i").explain(
Repository: spark
Updated Branches:
refs/heads/branch-1.5 af470a757 -> 3d1b9f007
[SPARK-9916] [BUILD] [SPARKR] removed left-over sparkr.zip copy/create commands
from codebase
sparkr.zip is now built by SparkSubmit on a need-to-build basis.
cc shivaram
Author: Burak Yavuz
Closes #8147 from
Repository: spark
Updated Branches:
refs/heads/master d7053bea9 -> 2fb4901b7
[SPARK-9916] [BUILD] [SPARKR] removed left-over sparkr.zip copy/create commands
from codebase
sparkr.zip is now built by SparkSubmit on a need-to-build basis.
cc shivaram
Author: Burak Yavuz
Closes #8147 from brk
Repository: spark
Updated Branches:
refs/heads/branch-1.5 a06860c2f -> af470a757
[SPARK-9903] [MLLIB] skip local processing in PrefixSpan if there are no small
prefixes
There exists a chance that the prefixes keep growing to the maximum pattern
length. Then the final local processing step be
Repository: spark
Updated Branches:
refs/heads/master d2d5e7fe2 -> d7053bea9
[SPARK-9903] [MLLIB] skip local processing in PrefixSpan if there are no small
prefixes
There exists a chance that the prefixes keep growing to the maximum pattern
length. Then the final local processing step become
Repository: spark
Updated Branches:
refs/heads/master 4413d0855 -> d2d5e7fe2
[SPARK-9704] [ML] Made ProbabilisticClassifier, Identifiable, VectorUDT public
APIs
Made ProbabilisticClassifier, Identifiable, VectorUDT public. All are
annotated as DeveloperApi.
CC: mengxr EronWright
Author: J
Repository: spark
Updated Branches:
refs/heads/branch-1.5 c182dc4a4 -> a06860c2f
[SPARK-9704] [ML] Made ProbabilisticClassifier, Identifiable, VectorUDT public
APIs
Made ProbabilisticClassifier, Identifiable, VectorUDT public. All are
annotated as DeveloperApi.
CC: mengxr EronWright
Autho
Repository: spark
Updated Branches:
refs/heads/branch-1.5 71ea61f90 -> c182dc4a4
[SPARK-9199] [CORE] Update Tachyon dependency from 0.7.0 -> 0.7.1.
Updates the tachyon-client version to the latest release.
The main difference between 0.7.0 and 0.7.1 on the client side is to support
running T
Repository: spark
Updated Branches:
refs/heads/master 7c35746c9 -> 4413d0855
[SPARK-9908] [SQL] When spark.sql.tungsten.enabled is false, broadcast join
does not work
https://issues.apache.org/jira/browse/SPARK-9908
Author: Yin Huai
Closes #8149 from yhuai/SPARK-9908.
Project: http://git
Repository: spark
Updated Branches:
refs/heads/branch-1.5 eebb3f945 -> 71ea61f90
[SPARK-9908] [SQL] When spark.sql.tungsten.enabled is false, broadcast join
does not work
https://issues.apache.org/jira/browse/SPARK-9908
Author: Yin Huai
Closes #8149 from yhuai/SPARK-9908.
(cherry picked f
Repository: spark
Updated Branches:
refs/heads/branch-1.5 4b547b91d -> eebb3f945
[SPARK-9827] [SQL] fix fd leak in UnsafeRowSerializer
Currently, UnsafeRowSerializer does not close the InputStream, which will cause
an fd leak if the InputStream has an open fd in it.
TODO: the fd could still be leaked
Repository: spark
Updated Branches:
refs/heads/master 7b13ed27c -> 7c35746c9
[SPARK-9827] [SQL] fix fd leak in UnsafeRowSerializer
Currently, UnsafeRowSerializer does not close the InputStream, which will cause
an fd leak if the InputStream has an open fd in it.
TODO: the fd could still be leaked, if
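The general fix pattern (close the stream even when the consumer stops early or an error occurs) can be sketched in Python; this is an analogy, not the UnsafeRowSerializer change itself.

```python
import io

def read_rows(stream):
    """Yield newline-delimited rows from `stream`, guaranteeing the
    underlying descriptor is released when iteration ends for any
    reason. Sketch of the close-on-exit pattern, not Spark code."""
    try:
        for line in stream:
            yield line.rstrip(b"\n")
    finally:
        # runs on exhaustion, on error, and when the generator is closed
        stream.close()
```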
Repository: spark
Updated Branches:
refs/heads/branch-1.5 ca39c9e91 -> 4b547b91d
[SPARK-9870] Disable driver UI and Master REST server in SparkSubmitSuite
I think that we should pass additional configuration flags to disable the
driver UI and Master REST server in SparkSubmitSuite and HiveSpa
Repository: spark
Updated Branches:
refs/heads/master f4bc01f1f -> 7b13ed27c
[SPARK-9870] Disable driver UI and Master REST server in SparkSubmitSuite
I think that we should pass additional configuration flags to disable the
driver UI and Master REST server in SparkSubmitSuite and HiveSparkSu
Repository: spark
Updated Branches:
refs/heads/master 0d1d146c2 -> f4bc01f1f
[SPARK-9855] [SPARKR] Add expression functions into SparkR whose params are
simple
I added lots of expression functions for SparkR. This PR includes only
functions whose params are only `(Column)` or `(Column, Colu
Repository: spark
Updated Branches:
refs/heads/branch-1.5 62ab2a4c6 -> ca39c9e91
[SPARK-9855] [SPARKR] Add expression functions into SparkR whose params are
simple
I added lots of expression functions for SparkR. This PR includes only
functions whose params are only `(Column)` or `(Column,
Repository: spark
Updated Branches:
refs/heads/master 8ce60963c -> 0d1d146c2
[SPARK-9724] [WEB UI] Avoid unnecessary redirects in the Spark Web UI.
Author: Rohit Agarwal
Closes #8014 from mindprince/SPARK-9724 and squashes the following commits:
a7af5ff [Rohit Agarwal] [SPARK-9724] [WEB UI]
Repository: spark
Updated Branches:
refs/heads/master 660e6dcff -> 8ce60963c
[SPARK-9780] [STREAMING] [KAFKA] prevent NPE if KafkaRDD instantiation …fails
Author: cody koeninger
Closes #8133 from koeninger/SPARK-9780 and squashes the following commits:
406259d [cody koeninger] [SPARK
Repository: spark
Updated Branches:
refs/heads/branch-1.5 3298fb69f -> 62ab2a4c6
[SPARK-9780] [STREAMING] [KAFKA] prevent NPE if KafkaRDD instantiation …fails
Author: cody koeninger
Closes #8133 from koeninger/SPARK-9780 and squashes the following commits:
406259d [cody koeninger] [S
Repository: spark
Updated Branches:
refs/heads/branch-1.4 89c8aea94 -> 8ce86b23f
[SPARK-9826] [CORE] Fix cannot use custom classes in log4j.properties
Refactor Utils class and create ShutdownHookManager.
NOTE: Wasn't able to run /dev/run-tests on windows machine.
Manual tests were conducted l
Repository: spark
Updated Branches:
refs/heads/branch-1.5 ed73f5439 -> 3298fb69f
[SPARK-9449] [SQL] Include MetastoreRelation's inputFiles
Author: Michael Armbrust
Closes #8119 from marmbrus/metastoreInputFiles.
(cherry picked from commit 660e6dcff8125b83cc73dbe00c90cbe58744bc66)
Signed-off
Repository: spark
Updated Branches:
refs/heads/master fc1c7fd66 -> 660e6dcff
[SPARK-9449] [SQL] Include MetastoreRelation's inputFiles
Author: Michael Armbrust
Closes #8119 from marmbrus/metastoreInputFiles.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-
Repository: spark
Updated Branches:
refs/heads/branch-1.5 31b7fdc06 -> ed73f5439
[SPARK-9915] [ML] stopWords should use StringArrayParam
hhbyyh
Author: Xiangrui Meng
Closes #8141 from mengxr/SPARK-9915.
(cherry picked from commit fc1c7fd66e64ccea53b31cd2fbb98bc6d307329c)
Signed-off-by: Xia
Repository: spark
Updated Branches:
refs/heads/master e6aef5576 -> fc1c7fd66
[SPARK-9915] [ML] stopWords should use StringArrayParam
hhbyyh
Author: Xiangrui Meng
Closes #8141 from mengxr/SPARK-9915.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apach
Repository: spark
Updated Branches:
refs/heads/master 6e409bc13 -> e6aef5576
[SPARK-9912] [MLLIB] QRDecomposition should use QType and RType for type names
instead of UType and VType
hhbyyh
Author: Xiangrui Meng
Closes #8140 from mengxr/SPARK-9912.
Project: http://git-wip-us.apache.org/r
Repository: spark
Updated Branches:
refs/heads/branch-1.5 2f8793b5f -> 31b7fdc06
[SPARK-9912] [MLLIB] QRDecomposition should use QType and RType for type names
instead of UType and VType
hhbyyh
Author: Xiangrui Meng
Closes #8140 from mengxr/SPARK-9912.
(cherry picked from commit e6aef5576
Repository: spark
Updated Branches:
refs/heads/branch-1.5 6aca0cf34 -> 2f8793b5f
[SPARK-9909] [ML] [TRIVIAL] move weightCol to shared params
As per the TODO move weightCol to Shared Params.
Author: Holden Karau
Closes #8144 from holdenk/SPARK-9909-move-weightCol-toSharedParams.
(cherry pic
Repository: spark
Updated Branches:
refs/heads/master caa14d9dc -> 6e409bc13
[SPARK-9909] [ML] [TRIVIAL] move weightCol to shared params
As per the TODO move weightCol to Shared Params.
Author: Holden Karau
Closes #8144 from holdenk/SPARK-9909-move-weightCol-toSharedParams.
Project: http:
Repository: spark
Updated Branches:
refs/heads/branch-1.5 08f767a1e -> 6aca0cf34
[SPARK-9913] [MLLIB] LDAUtils should be private
feynmanliang
Author: Xiangrui Meng
Closes #8142 from mengxr/SPARK-9913.
(cherry picked from commit caa14d9dc9e2eb1102052b22445b63b0e004e3c7)
Signed-off-by: Xiang
Repository: spark
Updated Branches:
refs/heads/master 7035d880a -> caa14d9dc
[SPARK-9913] [MLLIB] LDAUtils should be private
feynmanliang
Author: Xiangrui Meng
Closes #8142 from mengxr/SPARK-9913.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.
Repository: spark
Updated Branches:
refs/heads/branch-1.5 74c9dcec3 -> 08f767a1e
[SPARK-9894] [SQL] Json writer should handle MapData.
https://issues.apache.org/jira/browse/SPARK-9894
Author: Yin Huai
Closes #8137 from yhuai/jsonMapData.
(cherry picked from commit 7035d880a0cf06910c19b4afd
Repository: spark
Updated Branches:
refs/heads/master ab7e721cf -> 7035d880a
[SPARK-9894] [SQL] Json writer should handle MapData.
https://issues.apache.org/jira/browse/SPARK-9894
Author: Yin Huai
Closes #8137 from yhuai/jsonMapData.
Project: http://git-wip-us.apache.org/repos/asf/spark/r
Repository: spark
Updated Branches:
refs/heads/branch-1.5 8537e51d3 -> 74c9dcec3
[SPARK-9826] [CORE] Fix cannot use custom classes in log4j.properties
Refactor Utils class and create ShutdownHookManager.
NOTE: Wasn't able to run /dev/run-tests on windows machine.
Manual tests were conducted l
Repository: spark
Updated Branches:
refs/heads/master 738f35398 -> ab7e721cf
[SPARK-9826] [CORE] Fix cannot use custom classes in log4j.properties
Refactor Utils class and create ShutdownHookManager.
NOTE: Wasn't able to run /dev/run-tests on windows machine.
Manual tests were conducted local
Repository: spark
Updated Branches:
refs/heads/master a17384fa3 -> 738f35398
[SPARK-9092] Fixed incompatibility when both num-executors and dynamic...
… allocation are set. Now, dynamic allocation is set to false when
num-executors is explicitly specified as an argument. Consequently,
exec
Repository: spark
Updated Branches:
refs/heads/branch-1.5 b28295fe0 -> 8537e51d3
[SPARK-9092] Fixed incompatibility when both num-executors and dynamic...
… allocation are set. Now, dynamic allocation is set to false when
num-executors is explicitly specified as an argument. Consequently,
Repository: spark
Updated Branches:
refs/heads/branch-1.5 6a7582ea2 -> b28295fe0
[SPARK-9907] [SQL] Python crc32 is mistakenly calling md5
Author: Reynold Xin
Closes #8138 from rxin/SPARK-9907.
(cherry picked from commit a17384fa343628cec44437da5b80b9403ecd5838)
Signed-off-by: Reynold Xin
Repository: spark
Updated Branches:
refs/heads/master 6f60298b1 -> a17384fa3
[SPARK-9907] [SQL] Python crc32 is mistakenly calling md5
Author: Reynold Xin
Closes #8138 from rxin/SPARK-9907.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repo
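The class of bug fixed above is easy to demonstrate with the standard library: CRC-32 is a 32-bit checksum while MD5 is a 128-bit digest, so wiring one to the other produces values of the wrong kind entirely. Function names below are illustrative.

```python
import zlib
import hashlib

def crc32(data: bytes) -> int:
    """CRC-32 checksum as an unsigned 32-bit integer."""
    return zlib.crc32(data) & 0xFFFFFFFF

def md5_hex(data: bytes) -> str:
    """MD5 digest as a 32-character hex string -- a different
    primitive, which is why calling it from crc32 was a bug."""
    return hashlib.md5(data).hexdigest()
```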
Repository: spark
Updated Branches:
refs/heads/master 551def5d6 -> 6f60298b1
[SPARK-8967] [DOC] add Since annotation
Add `Since` as a Scala annotation. The benefit is that we can use it without
having explicit JavaDoc. This is useful for inherited methods. The limitation
is that it doesn't s
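A rough Python analogue of such an annotation is a decorator that tags an API element with the version it appeared in; this is purely illustrative of the concept, not how the Scala `Since` annotation works internally.

```python
def since(version):
    """Attach a "since version" tag to an API element -- a loose
    Python analogue of a Scala annotation. Illustrative only."""
    def wrap(fn):
        fn._since = version  # queryable by doc tooling
        return fn
    return wrap

@since("1.5.0")
def train():
    """A hypothetical API entry point tagged with its first version."""
    return "model"
```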
Repository: spark
Updated Branches:
refs/heads/branch-1.5 bdf8dc15d -> 6a7582ea2
[SPARK-8967] [DOC] add Since annotation
Add `Since` as a Scala annotation. The benefit is that we can use it without
having explicit JavaDoc. This is useful for inherited methods. The limitation
is that it doesn
Repository: spark
Updated Branches:
refs/heads/master 762bacc16 -> 551def5d6
[SPARK-9789] [ML] Added logreg threshold param back
Reinstated LogisticRegression.threshold Param for binary compatibility. Param
thresholds overrides threshold, if set.
CC: mengxr dbtsai feynmanliang
Author: Jose
Repository: spark
Updated Branches:
refs/heads/branch-1.5 65b5b2172 -> bdf8dc15d
[SPARK-9789] [ML] Added logreg threshold param back
Reinstated LogisticRegression.threshold Param for binary compatibility. Param
thresholds overrides threshold, if set.
CC: mengxr dbtsai feynmanliang
Author:
Repository: spark
Updated Branches:
refs/heads/master 60103ecd3 -> 762bacc16
[SPARK-9766] [ML] [PySpark] check and add missing docs for PySpark ML
Check and add missing docs for PySpark ML (this issue only checks missing docs for
o.a.s.ml not o.a.s.mllib).
Author: Yanbo Liang
Closes #8059 from yanbo
Repository: spark
Updated Branches:
refs/heads/branch-1.5 8629c33b6 -> 65b5b2172
[SPARK-9766] [ML] [PySpark] check and add missing docs for PySpark ML
Check and add missing docs for PySpark ML (this issue only checks missing docs for
o.a.s.ml not o.a.s.mllib).
Author: Yanbo Liang
Closes #8059 from y
Repository: spark
Updated Branches:
refs/heads/branch-1.5 b515f890d -> 8629c33b6
[SPARK-9726] [PYTHON] PySpark DF join no longer accepts on=None
rxin
First pull request for Spark so let me know if I am missing anything
The contribution is my original work and I license the work to the project
Repository: spark
Updated Branches:
refs/heads/master 70fe55886 -> 60103ecd3
[SPARK-9726] [PYTHON] PySpark DF join no longer accepts on=None
rxin
First pull request for Spark so let me know if I am missing anything
The contribution is my original work and I license the work to the project
un
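The semantics at stake can be sketched with a toy join over lists of dicts, where `on=None` means "no join condition". This is only an analogy for the accepted-argument shape; it is not PySpark's implementation, and the cartesian behaviour shown for `on=None` is an assumption of the sketch.

```python
def join(left, right, on=None):
    """Minimal nested-loop join over lists of dicts. `on=None` is
    accepted and means no join condition (cartesian product here).
    Illustrative only; names and semantics are assumptions."""
    if on is None:
        return [{**row_l, **row_r} for row_l in left for row_r in right]
    return [{**row_l, **row_r}
            for row_l in left for row_r in right
            if row_l[on] == row_r[on]]
```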
Repository: spark
Updated Branches:
refs/heads/branch-1.5 e9641f192 -> b515f890d
[SPARK-9847] [ML] Modified copyValues to distinguish between default, explicit
param values
From JIRA: Currently, Params.copyValues copies default parameter values to the
paramMap of the target instance, rathe
Repository: spark
Updated Branches:
refs/heads/master 57ec27dd7 -> 70fe55886
[SPARK-9847] [ML] Modified copyValues to distinguish between default, explicit
param values
From JIRA: Currently, Params.copyValues copies default parameter values to the
paramMap of the target instance, rather th
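The distinction the JIRA describes can be sketched by keeping defaults and explicitly set values in separate maps, and having the copy transfer each to the same tier on the target instead of promoting defaults to explicit values. This is a toy model, not the real `Params` API.

```python
class Params:
    """Toy params container: defaults and explicit settings are kept
    apart, and copying preserves that distinction. Illustrative only."""
    def __init__(self):
        self._defaults = {}
        self._explicit = {}

    def set_default(self, name, value):
        self._defaults[name] = value

    def set(self, name, value):
        self._explicit[name] = value

    def get(self, name):
        # explicit settings win over defaults
        return self._explicit.get(name, self._defaults.get(name))

    def copy_values(self, target):
        target._defaults.update(self._defaults)  # defaults stay defaults
        target._explicit.update(self._explicit)  # explicit stays explicit
        return target
```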
Repository: spark
Updated Branches:
refs/heads/branch-1.5 4c6b1296d -> e9641f192
[SPARK-9804] [HIVE] Use correct value for isSrcLocal parameter.
If the correct parameter is not provided, Hive will run into an error
because it calls methods that are specific to the local filesystem to
copy the
Repository: spark
Updated Branches:
refs/heads/master e0110792e -> 57ec27dd7
[SPARK-9804] [HIVE] Use correct value for isSrcLocal parameter.
If the correct parameter is not provided, Hive will run into an error
because it calls methods that are specific to the local filesystem to
copy the data
Repository: spark
Updated Branches:
refs/heads/master 66d87c1d7 -> e0110792e
[SPARK-9747] [SQL] Avoid starving an unsafe operator in aggregation
This is the sister patch to #8011, but for aggregation.
In a nutshell: create the `TungstenAggregationIterator` before computing the
parent partiti
Repository: spark
Updated Branches:
refs/heads/branch-1.5 2d86faddd -> 4c6b1296d
[SPARK-9747] [SQL] Avoid starving an unsafe operator in aggregation
This is the sister patch to #8011, but for aggregation.
In a nutshell: create the `TungstenAggregationIterator` before computing the
parent par
Repository: spark
Updated Branches:
refs/heads/branch-1.5 bc4ac65d4 -> 2d86faddd
[SPARK-7583] [MLLIB] User guide update for RegexTokenizer
jira: https://issues.apache.org/jira/browse/SPARK-7583
User guide update for RegexTokenizer
Author: Yuhao Yang
Closes #7828 from hhbyyh/regexTokenizerD
Repository: spark
Updated Branches:
refs/heads/master be5d19120 -> 66d87c1d7
[SPARK-7583] [MLLIB] User guide update for RegexTokenizer
jira: https://issues.apache.org/jira/browse/SPARK-7583
User guide update for RegexTokenizer
Author: Yuhao Yang
Closes #7828 from hhbyyh/regexTokenizerDoc.
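For readers of the user guide update, RegexTokenizer's two modes are simple to mimic in plain Python: with `gaps=True` the pattern matches delimiters and splits on them; with `gaps=False` the matches themselves become the tokens. Parameter names follow the ML docs informally; the function itself is a sketch, not PySpark code.

```python
import re

def regex_tokenize(text, pattern=r"\s+", gaps=True, min_token_length=1):
    """Sketch of RegexTokenizer behaviour: split on matches when
    gaps=True, keep matches as tokens when gaps=False, lowercase,
    and drop tokens shorter than min_token_length."""
    tokens = re.split(pattern, text) if gaps else re.findall(pattern, text)
    return [t.lower() for t in tokens if len(t) >= min_token_length]
```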
Repository: spark
Updated Branches:
refs/heads/branch-1.5 0579f28df -> bc4ac65d4
[SPARK-9795] Dynamic allocation: avoid double counting when killing same
executor twice
This is based on KaiXinXiaoLei's changes in #7716.
The issue is that when someone calls `sc.killExecutor("1")` on the same
Repository: spark
Updated Branches:
refs/heads/master 2e680668f -> be5d19120
[SPARK-9795] Dynamic allocation: avoid double counting when killing same
executor twice
This is based on KaiXinXiaoLei's changes in #7716.
The issue is that when someone calls `sc.killExecutor("1")` on the same
exe
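The double-counting fix described above reduces to remembering which executors are already pending removal, so a repeated kill of the same id does not decrement the target a second time. All names below are hypothetical; this is not Spark's ExecutorAllocationManager.

```python
class ExecutorAllocator:
    """Toy allocator: a set of ids pending removal guards the target
    count against being decremented twice for one executor.
    Illustrative sketch only."""
    def __init__(self, target):
        self.target = target
        self._pending_kill = set()

    def kill_executor(self, exec_id):
        if exec_id in self._pending_kill:
            return False  # already counted; ignore the repeat
        self._pending_kill.add(exec_id)
        self.target -= 1
        return True
```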
Repository: spark
Updated Branches:
refs/heads/branch-1.5 5e6fdc659 -> 0579f28df
[SPARK-8625] [CORE] Propagate user exceptions in tasks back to driver
This allows clients to retrieve the original exception from the
cause field of the SparkException that is thrown by the driver.
If the original
Repository: spark
Updated Branches:
refs/heads/master 3ecb37943 -> 2e680668f
[SPARK-8625] [CORE] Propagate user exceptions in tasks back to driver
This allows clients to retrieve the original exception from the
cause field of the SparkException that is thrown by the driver.
If the original exc
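The retrieval path described (original exception reachable via the cause field of the rethrown SparkException) maps naturally onto Python's exception chaining; the class and helper below are illustrative stand-ins, not Spark's actual types.

```python
class SparkException(Exception):
    """Toy wrapper exception, standing in for the one the driver
    rethrows on task failure. Illustrative only."""

def run_task(task):
    """Run a user task; on failure, rethrow wrapped so the original
    exception stays reachable via the cause."""
    try:
        return task()
    except Exception as user_exc:
        raise SparkException("Job aborted due to task failure") from user_exc

def original_cause(task):
    """Client-side recovery of the user exception from the wrapper."""
    try:
        run_task(task)
        return None
    except SparkException as exc:
        return exc.__cause__
```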
Repository: spark
Updated Branches:
refs/heads/branch-1.5 8e32db9a5 -> 5e6fdc659
[SPARK-9407] [SQL] Relaxes Parquet ValidTypeMap to allow ENUM predicates to be
pushed down
This PR adds a hacky workaround for PARQUET-201, and should be removed once we
upgrade to parquet-mr 1.8.1 or higher ver
Repository: spark
Updated Branches:
refs/heads/master 9d0822455 -> 3ecb37943
[SPARK-9407] [SQL] Relaxes Parquet ValidTypeMap to allow ENUM predicates to be
pushed down
This PR adds a hacky workaround for PARQUET-201, and should be removed once we
upgrade to parquet-mr 1.8.1 or higher version
Repository: spark
Updated Branches:
refs/heads/branch-1.5 5dd0c5cd6 -> 8e32db9a5
[SPARK-9182] [SQL] Filters are not passed through to jdbc source
This PR fixes unable to push filter down to JDBC source caused by `Cast` during
pattern matching.
While we are comparing columns of different type
Repository: spark
Updated Branches:
refs/heads/master 741a29f98 -> 9d0822455
[SPARK-9182] [SQL] Filters are not passed through to jdbc source
This PR fixes unable to push filter down to JDBC source caused by `Cast` during
pattern matching.
While we are comparing columns of different type, th